diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index 063ab79e98..629210ec27 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -20,8 +20,7 @@ The overall procedure for generating an experiment is shown in :numref:`Figure % * :ref:`Install prerequisites ` * :ref:`Clone the SRW App from GitHub ` * :ref:`Check out the external repositories ` - * :ref:`Set up the build environment ` - * :ref:`Build the executables ` + * :ref:`Set up the build environment and build the executables ` * :ref:`Download and stage data ` * :ref:`Optional: Configure a new grid ` * :ref:`Generate a regional workflow experiment ` @@ -50,11 +49,21 @@ Install the HPC-Stack Background ---------------- -The UFS Weather Model draws on over 50 code libraries to run its applications. These libraries range from libraries developed in-house at NOAA (e.g. NCEPLIBS, FMS, etc.) to libraries developed by NOAA's partners (e.g. PIO, ESMF, etc.) to truly third party libraries (e.g. NETCDF). Individual installation of these libraries is not practical, so the `HPC-Stack `__ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack is required to run the SRW App. +The UFS Weather Model draws on over 50 code libraries to run its applications. These libraries range from libraries developed in-house at NOAA (e.g., NCEPLIBS, FMS) to libraries developed by NOAA's partners (e.g., PIO, ESMF) to truly third party libraries (e.g., NETCDF). Individual installation of these libraries is not practical, so the `HPC-Stack `__ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack is required to run the SRW App. Instructions ------------------------- -Users working on systems that fall under `Support Levels 2-4 `_ will need to install the HPC-Stack the first time they try to build applications (such as the SRW App) or models that depend on it. Users can either build the HPC-stack on their local system or use the centrally maintained stacks on each HPC platform if they are working on a Level 1 system. For a detailed description of installation options, see :ref:`Installing the HPC-Stack `. +Users working on systems that fall under `Support Levels 2-4 `_ will need to install the HPC-Stack the first time they try to build applications (such as the SRW App) or models that depend on it. Users can either build the HPC-Stack on their local system or use the centrally maintained stacks on each HPC platform if they are working on a Level 1 system. Before installing the HPC-Stack, users on both Linux and MacOS systems should set the stack size to "unlimited" (if allowed) or to the largest possible value: + +.. code-block:: console + + # Linux, if allowed + ulimit -s unlimited + + # MacOS, this corresponds to 65MB + ulimit -S -s unlimited + +For a detailed description of installation options, see :ref:`Installing the HPC-Stack `. After completing installation, continue to the next section. @@ -114,7 +123,7 @@ The cloned repository contains the configuration files and sub-directories shown Check Out External Components ================================ -The SRW App relies on a variety of components (e.g., regional_workflow, UFS_UTILS, ufs-weather-model, and UPP) detailed in :numref:`Chapter %s ` of this User's Guide. 
Users must run the ``checkout_externals`` script to link the necessary external repositories to the SRW App. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App to clone the correct tags (code versions) of the external repositories listed in :numref:`Section %s ` into the appropriate directories under the ``regional_workflow`` and ``src`` directories. +The SRW App relies on a variety of components (e.g., regional_workflow, UFS_UTILS, ufs-weather-model, and UPP) detailed in :numref:`Chapter %s ` of this User's Guide. Each component has its own :term:`repository`. Users must run the ``checkout_externals`` script to collect the individual components of the SRW App from their respective git repositories. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App to clone the correct tags (code versions) of the external repositories listed in :numref:`Section %s ` into the appropriate directories under the ``regional_workflow`` and ``src`` directories. Run the executable that pulls in SRW App components from external repositories: @@ -123,83 +132,38 @@ Run the executable that pulls in SRW App components from external repositories: cd ufs-srweather-app ./manage_externals/checkout_externals +The script should output dialogue indicating that it is retrieving different code repositories. It may take several minutes to download these repositories. +.. _BuildExecutables: -Build with ``devbuild.sh`` -========================== - -On Level-1 systems, for which a modulefile is provided under ``modulefiles`` directory, we can build SRW App binaries with: - -.. code-block:: console - - ./devbuild.sh --platform=hera - -If compiler auto-detection fails for some reason, specify it using - -.. code-block:: console - - ./devbuild.sh --platform=hera --compiler=intel - -If this method doesn't work, we will have to manually setup the environment, and build SRW app binaries with CMake. - -.. _SetUpBuild: +Set Up the Environment and Build the Executables +=================================================== -Set up the Build/Run Environment -================================ +.. _DevBuild: -We need to setup our environment to run a workflow or to build the SRW app with CMake. Note that ``devbuild.sh`` does not prepare environment for workflow runs so this step is necessary even though binaries are built properly using ``devbuild.sh``. +``devbuild.sh`` Approach +----------------------------- -The build environment must be set up for the user's specific platform. First, we need to make sure ``Lmod`` is the app used for loading modulefiles. That is often the case on most systems, however, on some systems such as Gaea/Odin, the default modulefile loader is from Cray and we need to swap it for ``Lmod``. For example on Gaea, assuming a ``bash`` login shell, run: +On Level 1 systems for which a modulefile is provided under the ``modulefiles`` directory, we can build the SRW App binaries with: .. code-block:: console - source etc/lmod-setup.sh gaea + ./devbuild.sh --platform= -or if your login shell is ``csh`` or ``tcsh``, source ``etc/lmod-setup.csh`` instead. If you execute the above command on systems that don't need it, it will simply do a ``module purge``. From here on, we can assume, ``Lmod`` is ready to load modulefiles needed by the SRW app. +where ```` is replaced with the name of the platform the user is working on. 
Valid values are: ``cheyenne`` | ``gaea`` | ``hera`` | ``jet`` | ``macos`` | ``odin`` | ``orion`` | ``singularity`` | ``wcoss_dell_p3``

-The modulefiles needed for building and running SRW App are located in ``modulefiles`` directory. To load the necessary modulefile for a specific ```` using ```` , run:
+If compiler auto-detection fails for some reason, specify it using the ``--compiler`` argument. For example:

 .. code-block:: console

-   module use
-   module load build__
-
-where ```` is the full path to the ``modulefiles`` directory. This will work on Level 1 systems, where a modulefile is available in the ``modulefiles`` directory.
-
-On Level 2-4 systems, users will need to modify certain environment variables, such as the path to NCEP libraries, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__`` modulefiles can be copied and used as a template. To check whether Lmod is installed, run ``echo $LMOD_PKG``, and see if it outputs a path to the Lmod package. On systems without Lmod, users can modify or set the required environment variables with the ``export`` or ``setenv`` commands despending on whether they are using a bash or csh/tcsh shell, respectively:
-
-.. code-block::
-
-   export =
-   setenv
-
-
-.. _BuildExecutables:
-
-Build the Executables
-=======================
-
-Create a directory to hold the build's executables:
-
-.. code-block:: console
-
-   mkdir build
-   cd build
-
-From the build directory, run the following commands to build the pre-processing utilities, forecast model, and post-processor:
-
-.. code-block:: console
-
-   cmake .. -DCMAKE_INSTALL_PREFIX=..
-   make -j 4 >& build.out &
+   ./devbuild.sh --platform=hera --compiler=intel

-``-DCMAKE_INSTALL_PREFIX`` specifies the location in which the ``bin``, ``include``, ``lib``, and ``share`` directories will be created. These directories will contain various components of the SRW App. Its recommended value ``..`` denotes one directory up from the build directory. In the next line, the ``make`` call argument ``-j 4`` indicates that the build will run in parallel with 4 threads.
+where valid values are ``intel`` or ``gnu``.

-The build will take a few minutes to complete. When it starts, a random number is printed to the console, and when it is done, a ``[1]+ Done`` message is printed to the console. ``[1]+ Exit`` indicates an error. Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, users should see the forecast model executable ``ufs_model`` and several pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory. These executables are described in :numref:`Table %s `.
+The last line of the console output should be ``[100%] Built target ufs-weather-model``, indicating that the UFS Weather Model executable has been built successfully.

-.. hint::
+The executables listed in :numref:`Table %s ` should appear in the ``ufs-srweather-app/bin`` directory. If this build method doesn't work, or if users are not on a supported machine, they will have to manually set up the environment and build the SRW App binaries with CMake as described in :numref:`Section %s `.

-   If you see the build.out file, but there is no ``ufs-srweather-app/bin`` directory, wait a few more minutes for the build to complete.

.. _ExecDescription:

@@ -258,19 +222,86 @@ The build will take a few minutes to complete. When it starts, a random number i
| | grid point.
|
 +------------------------+---------------------------------------------------------------------------------+

+.. _CMakeApproach:
+
+CMake Approach
+-----------------
+
+Set Up the Workflow Environment
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+.. attention::
+   If users successfully built the executables in :numref:`Step %s `, they should skip to step :numref:`Step %s `.
+
+If the ``devbuild.sh`` approach failed, users need to set up their environment to run a workflow on their specific platform. First, users should make sure ``Lmod`` is the app used for loading modulefiles. This is the case on most Level 1 systems; however, on systems such as Gaea/Odin, the default modulefile loader is from Cray and must be switched to Lmod. For example, on Gaea, assuming a ``bash`` login shell, run:
+
+.. code-block:: console
+
+   source etc/lmod-setup.sh gaea
+
+or if the login shell is ``csh`` or ``tcsh``, run ``source etc/lmod-setup.csh`` instead. If users execute the above command on systems that don't need it, it will not cause any problems (it will simply do a ``module purge``). From here on, ``Lmod`` is ready to load the modulefiles needed by the SRW App. These modulefiles are located in the ``modulefiles`` directory. To load the necessary modulefile for a specific ``<platform>`` using ``<compiler>``, run:
+
+.. code-block:: console
+
+   module use <path/to/modulefiles/directory>
+   module load build_<platform>_<compiler>
+
+where ``<path/to/modulefiles/directory>`` is the full path to the ``modulefiles`` directory. This will work on Level 1 systems, where a modulefile is available in the ``modulefiles`` directory.
+
+On Level 2-4 systems, users will need to modify certain environment variables, such as the path to HPC-Stack, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build_<platform>_<compiler>`` modulefiles can be copied and used as a template. To check whether Lmod is installed, run ``echo $LMOD_PKG``, and see if it outputs a path to the Lmod package. On systems without Lmod, users can modify or set the required environment variables with the ``export`` or ``setenv`` commands depending on whether they are using a bash or csh/tcsh shell, respectively:
+
+.. code-block::
+
+   export <VARIABLE_NAME>=<VALUE>
+   setenv <VARIABLE_NAME> <VALUE>
+
+..
+   COMMENT: Might be good to list an example here...
+
+.. _BuildCMake:
+
+Build the Executables Using CMake
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+.. attention::
+   If users successfully built the executables in :numref:`Step %s `, they should skip to step :numref:`Step %s `.
+
+In the ``ufs-srweather-app`` directory, create a subdirectory to hold the build's executables:
+
+.. code-block:: console
+
+   mkdir build
+   cd build
+
+From the build directory, run the following commands to build the pre-processing utilities, forecast model, and post-processor:
+
+.. code-block:: console
+
+   cmake .. -DCMAKE_INSTALL_PREFIX=..
+   make -j 4 >& build.out &
+
+``-DCMAKE_INSTALL_PREFIX`` specifies the location in which the ``bin``, ``include``, ``lib``, and ``share`` directories will be created. These directories will contain various components of the SRW App. Its recommended value ``..`` denotes one directory up from the build directory. In the next line, the ``make`` call argument ``-j 4`` indicates that the build will run in parallel with 4 threads. Although users can specify a larger or smaller number of threads (e.g., ``-j8``, ``-j2``), it is highly recommended to use at least 4 parallel threads to prevent overly long installation times.
+
+The build will take a few minutes to complete.
When it starts, a random number is printed to the console, and when it is done, a ``[1]+ Done`` message is printed to the console. ``[1]+ Exit`` indicates an error. Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, users should see the forecast model executable ``ufs_model`` and several pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory. These executables are described in :numref:`Table %s `. + +.. hint:: + + If you see the build.out file, but there is no ``ufs-srweather-app/bin`` directory, wait a few more minutes for the build to complete. + + .. _Data: Download and Stage the Data ============================ -The SRW App requires input files to run. These include static datasets, initial and boundary conditions files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW App tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App. +The SRW App requires input files to run. These include static datasets, initial and boundary conditions files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW App tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App. .. _GridSpecificConfig: Grid Configuration ======================= -The SRW App officially supports three different predefined grids as shown in :numref:`Table %s `. The "out-of-the-box" SRW App case uses the ``RRFS_CONUS_25km`` predefined grid option. More information on the predefined and user-generated grid options can be found in :numref:`Chapter %s ` for those who are curious. Users who plan to utilize one of the three pre-defined domain (grid) options may continue to :numref:`Step %s `. Users who plan to create a new domain should refer to :numref:`Chapter %s ` for details on how to do so. At a minimum, these users will need to add the new grid name to the ``valid_param_vals`` script and add the corresponding grid-specific parameters in the ``set_predef_grid_params`` script. +The SRW App officially supports four different predefined grids as shown in :numref:`Table %s `. The "out-of-the-box" SRW App case uses the ``RRFS_CONUS_25km`` predefined grid option. More information on the predefined and user-generated grid options can be found in :numref:`Chapter %s ` for those who are curious. Users who plan to utilize one of the four predefined domain (grid) options may continue to :numref:`Step %s `. Users who plan to create a new domain should refer to :numref:`Chapter %s ` for details on how to do so. At a minimum, these users will need to add the new grid name to the ``valid_param_vals`` script and add the corresponding grid-specific parameters in the ``set_predef_grid_params`` script. .. 
_PredefinedGrids: @@ -285,6 +316,8 @@ The SRW App officially supports three different predefined grids as shown in :nu +----------------------+-------------------+--------------------------------+ | RRFS_CONUS_3km | ESG grid | lambert_conformal | +----------------------+-------------------+--------------------------------+ + | SUBCONUS_Ind_3km | ESG grid | lambert_conformal | + +----------------------+-------------------+--------------------------------+ .. _GenerateForecast: @@ -535,11 +568,11 @@ The default settings in this file include a predefined 25-km :term:`CONUS` grid Next, edit the new ``config.sh`` file to customize it for your machine. At a minimum, change the ``MACHINE`` and ``ACCOUNT`` variables; then choose a name for the experiment directory by setting ``EXPT_SUBDIR``. If you have pre-staged the initialization data for the experiment, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and set the paths to the data for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``. -Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids `. +Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the four predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids `. .. important:: - If you set up the build environment with the GNU compiler in :numref:`Section %s `, you will have to check that the line ``COMPILER="gnu"`` appears in the ``config.sh`` file. + If your modulefile uses a GNU compiler to set up the build environment in :numref:`Section %s `, you will have to check that the line ``COMPILER="gnu"`` appears in the ``config.sh`` file. .. hint:: @@ -555,45 +588,49 @@ Minimum parameter settings for running the out-of-the-box SRW App case on Level ACCOUNT="" EXPT_SUBDIR="" USE_USER_STAGED_EXTRN_FILES="TRUE" - EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_app/staged_extrn_mdl_files" - EXTRN_MDL_SOURCE_BASEDIR_LBCS="/glade/p/ral/jntp/UFS_SRW_app/staged_extrn_mdl_files" + EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_App/develop/input_model_data///" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/glade/p/ral/jntp/UFS_SRW_App/develop/input_model_data///" + +where: +* refers to a subdirectory such as "FV3GFS" or "HRRR" containing the experiment data. +* refers to one of 3 possible data formats: ``grib2``, ``nemsio``, or ``netcdf``. +* YYYYMMDDHH refers to a subdirectory containing data for the :term:`cycle` date. + **Hera, Jet, Orion, Gaea:** -The ``MACHINE``, ``ACCOUNT``, and ``EXPT_SUBDIR`` settings are the same as for Cheyenne, except that ``"cheyenne"`` should be switched to ``"hera"``, ``"jet"``, ``"orion"``, or ``"gaea"``, respectively. Set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, but replace the file paths to Cheyenne's data with the file paths for the correct machine. ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` use the same file path. 
+The ``MACHINE``, ``ACCOUNT``, and ``EXPT_SUBDIR`` settings are the same as for Cheyenne, except that ``"cheyenne"`` should be switched to ``"hera"``, ``"jet"``, ``"orion"``, or ``"gaea"``, respectively. Set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, but replace the file paths to Cheyenne's data with the file paths for the correct machine. ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` use the same base file path. On Hera: .. code-block:: console - "/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data" + "/scratch2/BMC/det/UFS_SRW_App/develop/input_model_data///YYYYMMDDHH/" On Jet: .. code-block:: console - "/lfs4/BMC/wrfruc/FV3-LAM/model_data" + "/mnt/lfs4/BMC/wrfruc/UFS_SRW_App/develop/input_model_data///YYYYMMDDHH/" On Orion: .. code-block:: console - "/work/noaa/fv3-cam/UFS_SRW_app/v1p0/model_data" - + "/work/noaa/fv3-cam/UFS_SRW_App/develop/input_model_data///YYYYMMDDHH/" On Gaea: .. code-block:: console - "/lustre/f2/pdata/esrl/gsd/ufs/ufs-srw-release-v1.0.0/staged_extrn_mdl_files" - + "/lustre/f2/pdata/ncep/UFS_SRW_App/develop/input_model_data///YYYYMMDDHH/" For **WCOSS** systems, edit ``config.sh`` with these WCOSS-specific parameters, and use a valid WCOSS project code for the account parameter: .. code-block:: console MACHINE="wcoss_cray" or MACHINE="wcoss_dell_p3" - ACCOUNT="my_account" + ACCOUNT="valid_wcoss_project_code" EXPT_SUBDIR="my_expt_name" USE_USER_STAGED_EXTRN_FILES="TRUE" @@ -601,31 +638,23 @@ For WCOSS_DELL_P3: .. code-block:: console - EXTRN_MDL_SOURCE_BASEDIR_ICS="/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/model_data" - EXTRN_MDL_SOURCE_BASEDIR_LBCS="/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/model_data" - -For WCOSS_CRAY: - -.. code-block:: console - - EXTRN_MDL_SOURCE_BASEDIR_ICS="/gpfs/hps3/emc/meso/noscrub/UFS_SRW_App/model_data" - EXTRN_MDL_SOURCE_BASEDIR_LBCS="/gpfs/hps3/emc/meso/noscrub/UFS_SRW_App/model_data" - + EXTRN_MDL_SOURCE_BASEDIR_ICS="/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/develop/model_data///YYYYMMDDHH/ICS" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/develop/input_model_data///YYYYMMDDHH/LBCS" **NOAA Cloud Systems:** .. code-block:: console - MACHINE="SINGULARITY" + MACHINE="NOAACLOUD" ACCOUNT="none" EXPT_SUBDIR="" EXPT_BASEDIR="lustre/$USER/expt_dirs" COMPILER="gnu" USE_USER_STAGED_EXTRN_FILES="TRUE" - EXTRN_MDL_SOURCE_BASEDIR_ICS="/contrib/EPIC/model_data/FV3GFS" - EXTRN_MDL_FILES_ICS=( "gfs.pgrb2.0p25.f000" ) - EXTRN_MDL_SOURCE_BASEDIR_LBCS="/contrib/EPIC/model_data/FV3GFS" - EXTRN_MDL_FILES_LBCS=( "gfs.pgrb2.0p25.f006" "gfs.pgrb2.0p25.f012" ) + EXTRN_MDL_SOURCE_BASEDIR_ICS="/contrib/EPIC/UFS_SRW_App/develop/input_model_data/FV3GFS" + EXTRN_MDL_FILES_ICS=( "gfs.t18z.pgrb2.0p25.f000" ) + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/contrib/EPIC/UFS_SRW_App/develop/input_model_data/FV3GFS" + EXTRN_MDL_FILES_LBCS=( "gfs.t18z.pgrb2.0p25.f006" "gfs.t18z.pgrb2.0p25.f012" ) .. note:: @@ -692,7 +721,9 @@ The workflow requires Python 3 with the packages 'PyYAML', 'Jinja2', and 'f90nml .. code-block:: console + module use module load wflow_ + conda activate regional_workflow This command will activate the ``regional_workflow`` conda environment. The user should see ``(regional_workflow)`` in front of the Terminal prompt at this point. If this is not the case, activate the regional workflow from the ``ush`` directory by running: @@ -886,13 +917,19 @@ In addition to the baseline tasks described in :numref:`Table %s `. 
There are two main ways to run the workflow with Rocoto: (1) with the ``launch_FV3LAM_wflow.sh`` script, and (2) by manually calling the ``rocotorun`` command. Users can also automate the workflow using a crontab. + +.. attention:: + + If users are running the SRW App in a container or on a system that does not have Rocoto installed (e.g., `Level 3 & 4 `__ systems, such as MacOS), they should follow the process outlined in :numref:`Section %s ` instead of the instructions in this section. + +The information in this section assumes that Rocoto is available on the desired platform. All official HPC platforms for the UFS SRW App release make use of the Rocoto workflow management software for running experiments. However, Rocoto cannot be used when running the workflow within a container. If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts according to the process outlined in :numref:`Section %s `. There are two main ways to run the workflow with Rocoto: (1) with the ``launch_FV3LAM_wflow.sh`` script, and (2) by manually calling the ``rocotorun`` command. Users can also automate the workflow using a crontab. + +.. note:: + Users may find it helpful to review :numref:`Chapter %s ` to gain a better understanding of Rocoto commands and workflow management before continuing, but this is not required to run the experiment. Optionally, an environment variable can be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is bash, it can be set as follows: diff --git a/docs/UsersGuide/source/CompleteTests.csv b/docs/UsersGuide/source/CompleteTests.csv new file mode 100644 index 0000000000..28cc46162f --- /dev/null +++ b/docs/UsersGuide/source/CompleteTests.csv @@ -0,0 +1,28 @@ +Grid,ICS,LBCS,Suite,Date,Time (UTC),Script Name,Test Type +RRFS_CONUS_3km,FV3GFS,FV3GFS,GFS_v16,2019-07-01,00,config.grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh,Complete +RRFS_CONUS_25km,HRRR,RAP,RRFS_v1beta,2020-08-10,00,config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh,Complete +RRFS_CONUS_13km,HRRR,RAP,RRFS_v1beta,2020-08-01,00,config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh,Complete +RRFS_CONUS_3km,HRRR,RAP,RRFS_v1beta,2020-08-01,00,config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh,Complete +RRFS_CONUS_25km,HRRR,RAP,HRRR,2020-08-10,00,config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh,Complete +RRFS_CONUS_13km,HRRR,RAP,HRRR,2020-08-10,00,config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_HRRR.sh,Complete +RRFS_CONUS_3km,HRRR,RAP,HRRR,2020-08-10,00,config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_HRRR.sh,Complete +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-01,"00,12",config.community_ensemble_008mems.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-01,"00,12",config.community_ensemble_2mems.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-02,"00,12",config.community_ensemble_008mems.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-02,"00,12",config.community_ensemble_2mems.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-07-01,00,config.deactivate_tasks.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-07-01,00,config.inline_post.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-06-15,00,config.MET_ensemble_verification.sh,Complete/wflow 
+RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-06-15,00,config.MET_verification.sh,Complete/wflow +ESGgrid,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp_regional,2019-07-01,00,config.new_ESGgrid.sh,Complete/wflow +GFDLgrid,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-01,00,config.new_GFDLgrid.sh,Complete/wflow +GFDLgrid,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-01,00,config.new_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_FALSE.sh,Complete/wflow +GFDLgrid,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-01,00,config.new_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_TRUE.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-07-01,00,config.pregen_grid_orog_sfc_climo.sh,Complete/wflow +RRFS_CONUS_25km,GSMGFS,GSMGFS,FV3_GFS_2017_gfdlmp,2019-05-20,00,config.specify_DOT_OR_USCORE.sh,Complete/wflow +RRFS_CONUScompact_25km,HRRR,RAP,FV3_HRRR,2020-08-01,00,config.specify_DT_ATMOS_LAYOUT_XY_BLOCKSIZE.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2021-06-03,06,config.specify_EXTRN_MDL_SYSBASEDIR_ICS_LBCS.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-07-01,00,config.specify_RESTART_INTERVAL.sh,Complete/wflow +RRFS_CONUScompact_25km,HRRR,RAP,FV3_RRFS_v1beta,2020-08-10,00,config.subhourly_post_ensemble_2mems.sh,Complete/wflow +RRFS_CONUScompact_25km,HRRR,RAP,FV3_RRFS_v1beta,2020-08-10,00,config.subhourly_post.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-07-01,00,config.specify_template_filenames.sh,Complete/wflow \ No newline at end of file diff --git a/docs/UsersGuide/source/CompleteTests.rst b/docs/UsersGuide/source/CompleteTests.rst new file mode 100644 index 0000000000..50cf6f69fb --- /dev/null +++ b/docs/UsersGuide/source/CompleteTests.rst @@ -0,0 +1,8 @@ +************************************************************ +Complete WE2E Tests +************************************************************ + +.. csv-table:: + :file: CompleteTests.csv + :widths: 20,20,20,20,20,20,20,20 + :header-rows: 1 diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index 0e16702f11..028c87c851 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -74,14 +74,12 @@ After running ``manage_externals/checkout_externals``, the visualization scripts Build System and Workflow ========================= -The SRW Application has a portable build system and a user-friendly, modular, and -expandable workflow framework. +The SRW Application has a portable build system and a user-friendly, modular, and expandable workflow framework. An umbrella CMake-based build system is used for building the components necessary for running the end-to-end SRW Application, including the UFS Weather Model and the pre- and post-processing software. Additional libraries necessary for the application (e.g., :term:`NCEPLIBS-external` and :term:`NCEPLIBS`) are not included in the SRW Application build system but are available pre-built on pre-configured platforms. On other systems, they can be installed via the HPC-Stack (see :numref:`Chapter %s: Installing the HPC-Stack `). There is a small set of system libraries and utilities that are assumed to be present on the target computer: the CMake build software, a Fortran, C, and C++ compiler, and an :term:`MPI` library. Once built, the provided experiment generator script can be used to create a Rocoto-based -workflow file that will run each task in the system in the proper sequence (see `Rocoto documentation -`_). 
If Rocoto and/or a batch system is not present on the available platform, the individual components can be run in a stand-alone, command line fashion with provided run scripts. The generated namelist for the atmospheric model can be modified in order to vary settings such as forecast starting and ending dates, forecast length hours, the :term:`CCPP` physics suite, integration time step, history file output frequency, and more. It also allows for configuration of other elements of the workflow; for example, whether to run some or all of the pre-processing, forecast model, and post-processing steps. +workflow file that will run each task in the system in the proper sequence (see :numref:`Chapter %s ` or the `Rocoto documentation `_) for more information. If Rocoto and/or a batch system is not present on the available platform, the individual components can be run in a stand-alone, command line fashion with provided run scripts. The generated namelist for the atmospheric model can be modified in order to vary settings such as forecast starting and ending dates, forecast length hours, the :term:`CCPP` physics suite, integration time step, history file output frequency, and more. It also allows for configuration of other elements of the workflow; for example, whether to run some or all of the pre-processing, forecast model, and post-processing steps. This SRW Application release has been tested on a variety of platforms widely used by researchers, such as the NOAA Research and Development High-Performance Computing Systems diff --git a/docs/UsersGuide/source/ConfigNewPlatform.rst b/docs/UsersGuide/source/ConfigNewPlatform.rst deleted file mode 100644 index 29b2912978..0000000000 --- a/docs/UsersGuide/source/ConfigNewPlatform.rst +++ /dev/null @@ -1,390 +0,0 @@ -.. _ConfigNewPlatform: - -========================== -Configuring a New Platform -========================== - -The UFS SRW Application has been designed to work primarily on a number of Level 1 and 2 support platforms, as specified `here `__. However, it is also designed with flexibility in mind, so that any sufficiently up-to-date machine with a UNIX-based operating system should be capable of running the application. A full list of prerequisites for installing the UFS SRW App and running the Graduate Student Test can be found in :numref:`Section %s `. - -The first step to installing on a new machine is to install :term:`NCEPLIBS` (https://github.com/NOAA-EMC/NCEPLIBS), the NCEP libraries package, which is a set of libraries created and maintained by NCEP and EMC that are used in many parts of the UFS. NCEPLIBS comes with a large number of prerequisites (see :numref:`Section %s ` for more info), but the only required software prior to starting the installation process are as follows: - -* Fortran compiler with support for Fortran 2003 - - * gfortran v9+ or ifort v18+ are the only ones tested, but others may work. - -* C and C++ compilers compatible with the Fortran compiler - - * gcc v9+, ifort v18+, and clang v9+ (macOS, native Apple clang or LLVM clang) have been tested - -* Python v3.6+ - - * Prerequisite packages must be downloaded: jinja2, yaml and f90nml, as well as a number of additional Python modules (see :numref:`Section %s `) if the user would like to use the provided graphics scripts - -* Perl 5 - -* git v1.8+ - -* CMake v3.12+ - - * CMake v3.15+ is needed for building NCEPLIBS, but versions as old as 3.12 can be used to build NCEPLIBS-external, which contains a newer CMake that can be used for the rest of the build. 
- -For both Linux and macOS, you will need to set the stack size to "unlimited" (if allowed) or the largest possible value. - -.. code-block:: console - - # Linux, if allowed - ulimit -s unlimited - - # macOS, this corresponds to 65MB - ulimit -S -s unlimited - -For Linux systems, as long as the above software is available, you can move on to the next step: installing the :term:`NCEPLIBS-external` package. - -For macOS systems, some extra software is needed: ``wget``, ``coreutils``, ``pkg-config``, and ``gnu-sed``. -It is recommended that you install this software using the Homebrew package manager for macOS (https://brew.sh/): - -* brew install wget - -* brew install cmake - -* brew install coreutils - -* brew install pkg-config - -* brew install gnu-sed - -However, it is also possible to install these utilities via Macports (https://www.macports.org/), or installing each utility individually (not recommended). - -Installing NCEPLIBS-external -============================ -In order to facilitate the installation of NCEPLIBS (and therefore, the SRW App and other UFS applications) on new platforms, EMC maintains a one-stop package containing most of the prerequisite libraries and software necessary for installing NCEPLIBS. This package is known as NCEPLIBS-external, and is maintained in a git repository at https://github.com/NOAA-EMC/NCEPLIBS-external. Instructions for installing these will depend on your platform, but generally so long as all the above-mentioned prerequisites have been installed you can follow the proceeding instructions verbatim (in bash; a csh-based shell will require different commands). Some examples for installing on specific platforms can be found in the `NCEPLIBS-external/doc directory `. - - -These instructions will install the NCEPLIBS-external in the current directory tree, so be sure you are in the desired location before starting. - -.. code-block:: console - - export WORKDIR=`pwd` - export INSTALL_PREFIX=${WORKDIR}/NCEPLIBS-ufs-v2.0.0/ - export CC=gcc - export FC=gfortran - export CXX=g++ - -The CC, CXX, and FC variables should specify the C, C++, and Fortran compilers you will be using, respectively. They can be the full path to the compiler if necessary (for example, on a machine with multiple versions of the same compiler). It will be important that all libraries and utilities are built with the same set of compilers, so it is best to set these variables once at the beginning of the process and not modify them again. - -.. code-block:: console - - mkdir -p ${INSTALL_PREFIX}/src && cd ${INSTALL_PREFIX}/src - git clone -b release/public-v2 --recursive https://github.com/NOAA-EMC/NCEPLIBS-external - cd NCEPLIBS-external - mkdir build && cd build - cmake -DCMAKE_INSTALL_PREFIX=${INSTALL_PREFIX} .. 2>&1 | tee log.cmake - make -j4 2>&1 | tee log.make - -The previous commands go through the process of cloning the git repository for NCEPLIBS-external, creating and entering a build directory, and invoking cmake and make to build the code/libraries. The ``make`` step will take a while; as many as a few hours depending on your machine and various settings. It is highly recommended you use at least 4 parallel make processes to prevent overly long installation times. The ``-j4`` option in the make command specifies 4 parallel make processes, ``-j8`` would specify 8 parallel processes, while omitting the flag all together will run make serially (not recommended). 
- -If you would rather use a different version of one or more of the software packages included in NCEPLIBS-external, you can skip building individual parts of the package by including the proper flags in your call to cmake. For example: - -.. code-block:: console - - cmake -DBUILD_MPI=OFF -DCMAKE_INSTALL_PREFIX=${INSTALL_PREFIX} .. 2>&1 | tee log.cmake - -will skip the building of MPICH that comes with NCEPLIBS-external. See the readme file ``NCEPLIBS-external/README.md`` for more information on these flags, or for general troubleshooting. - -Once NCEPLIBS-external is installed, you can move on to installing NCEPLIBS. - -Installing NCEPLIBS -=================== -Prior to building the UFS SRW Application on a new machine, you will need to install NCEPLIBS. Installation instructions will again depend on your platform, but so long as NCEPLIBS-external has been installed successfully you should be able to build NCEPLIBS. The following instructions will install the NCEPLIBS in the same directory tree as was used for NCEPLIBS-external above, so if you did not install NCEPLIBS-external in the same way, you will need to modify these commands. - -.. code-block:: console - - cd ${INSTALL_PREFIX}/src - git clone -b release/public-v2 --recursive https://github.com/NOAA-EMC/NCEPLIBS - cd NCEPLIBS - mkdir build && cd build - export ESMFMKFILE=${INSTALL_PREFIX}/lib/esmf.mk - cmake -DCMAKE_INSTALL_PREFIX=${INSTALL_PREFIX} -DCMAKE_PREFIX_PATH=${INSTALL_PREFIX} -DOPENMP=ON .. 2>&1 | tee log.cmake - make -j4 2>&1 | tee log.make - make deploy 2>&1 | tee log.deploy - -As with NCEPLIBS-external, the above commands go through the process of cloning the git repository for NCEPLIBS, creating and entering a build directory, and invoking cmake and make to build the code. The ``make deploy`` step created a number of modulefiles and scripts that will be used for setting up the build environment for the UFS SRW App. The ``ESMFMKFILE`` variable allows NCEPLIBS to find the location where ESMF has been built; if you receive a ``ESMF not found, abort`` error, you may need to specify a slightly different location: - -.. code-block:: console - - export ESMFMKFILE=${INSTALL_PREFIX}/lib64/esmf.mk - -Then delete and re-create the build directory and continue the build process as described above. - -If you skipped the building of any of the software provided by NCEPLIBS-external, you may need to add the appropriate locations to your ``CMAKE_PREFIX_PATH`` variable. Multiple directories may be added, separated by semicolons (;) like in the following example: - -.. code-block:: console - - cmake -DCMAKE_INSTALL_PREFIX=${INSTALL_PREFIX} -DCMAKE_PREFIX_PATH=”${INSTALL_PREFIX};/location/of/other/software” -DOPENMP=ON .. 2>&1 | tee log.cmake - -Further information on including prerequisite libraries, as well as other helpful tips, can be found in the ``NCEPLIBS/README.md`` file. - -Once the NCEPLIBS package has been successfully installed, you can move on to building the UFS SRW Application. - -Building the UFS SRW Application -======================================= -Building the UFS SRW App is similar to building NCEPLIBS, in that the code is stored in a git repository and is built using CMake software. The first step is to retrieve the code from GitHub, using the variables defined earlier: - -.. 
code-block:: console - - cd ${WORKDIR} - git clone -b release/public-v1 https://github.com/ufs-community/ufs-srweather-app.git - cd ufs-srweather-app/ - ./manage_externals/checkout_externals - -Here the procedure differs a bit from NCEPLIBS and NCEPLIBS-external. The UFS SRW App is maintained using an umbrella git repository that collects the individual components of the application from their individual, independent git repositories. This is handled using "Manage Externals" software, which is included in the application; this is the final step listed above, which should output a bunch of dialogue indicating that it is retrieving different code repositories as described in :numref:`Table %s `. It may take several minutes to download these repositories. - -Once the Manage Externals step has completed, you will need to make sure your environment is set up so that the UFS SRW App can find all of the prerequisite software and libraries. There are a few ways to do this, the simplest of which is to load a modulefile if your machine supports Lua Modules: - -.. code-block:: console - - module use ${INSTALL_PREFIX}/modules - module load NCEPLIBS/2.0.0 - -If your machine does not support Lua but rather TCL modules, see instructions in the ``NCEPLIBS/README.md`` file for converting to TCL modulefiles. - -If your machine does not support modulefiles, you can instead source the provided bash script for setting up the environment: - -.. code-block:: console - - source ${INSTALL_PREFIX}/bin/setenv_nceplibs.sh - -This script, just like the modulefiles, will set a number of environment variables that will allow CMake to easily find all the libraries that were just built. There is also a csh version of the script in the same directory if your shell is csh-based. If you are using your machine’s pre-built version of any of the NCEP libraries (not recommended), reference that file to see which variables should be set to point CMake in the right direction. - -At this point there are just a few more variables that need to be set prior to building: - -.. code-block:: console - - export CMAKE_C_COMPILER=mpicc - export CMAKE_CXX_COMPILER=mpicxx - export CMAKE_Fortran_COMPILER=mpifort - -If you are using your machine’s built-in MPI compilers, it is recommended you set the ``CMAKE_*_COMPILER`` flags to full paths to ensure that the correct MPI aliases are used. Finally, one last environment variable, ``CMAKE_Platform``, must be set. This will depend on your machine; for example, on a macOS operating system with GNU compilers: - -.. code-block:: console - - export CMAKE_Platform=macosx.gnu - -This is the variable used by the weather model to set a few additional flags based on your machine. The available options can be found `here `_. - -Now all the prerequisites have been installed and variables set, so you should be ready to build the model! - -.. code-block:: console - - mkdir build && cd build - cmake .. -DCMAKE_INSTALL_PREFIX=.. | tee log.cmake - make -j4 | tee log.make - -On many platforms this build step will take less than 30 minutes, but for some machines it may take up to a few hours, depending on the system architecture, compiler and compiler flags, and number of parallel make processes used. - -Setting Up Your Python Environment -================================== -The regional_workflow repository contains scripts for generating and running experiments, and these require some specific python packages to function correctly. First, as mentioned before, your platform will need Python 3.6 or newer installed. 
Once this is done, you will need to install several python packages that are used by the workflow: ``jinja2`` (https://jinja2docs.readthedocs.io/), ``pyyaml`` (https://pyyaml.org/wiki/PyYAML), and ``f90nml`` (https://pypi.org/project/f90nml/). These packages can be installed individually, but it is recommended you use a package manager (https://www.datacamp.com/community/tutorials/pip-python-package-manager). - -If you have conda on your machine: - -.. code-block:: console - - conda install jinja2 pyyaml f90nml - -Otherwise you may be able to use pip3 (the Python3 package manager; may need to be installed separately depending on your platform): - -.. code-block:: console - - pip3 install jinja2 pyyaml f90nml - -Running the graphics scripts in ``${WORKDIR}/ufs-srweather-app/regional_workflow/ush/Python`` will require the additional packages ``pygrib``, ``cartopy``, ``matplotlib``, ``scipy``, and ``pillow``. These can be installed in the same way as described above. - -For the final step of creating and running an experiment, the exact methods will depend on if you are running with or without a workflow manager (Rocoto). - -Running Without a Workflow Manager: Generic Linux and macOS Platforms -===================================================================== -Now that the code has been built, you can stage your data as described in :numref:`Section %s `. - -Once the data has been staged, setting up your experiment on a platform without a workflow manager is similar to the procedure for other platforms described in earlier chapters. Enter the ``${WORKDIR}/ufs-srweather-app/regional_workflow/ush`` directory and configure the workflow by creating a ``config.sh`` file as described in :numref:`Chapter %s `. There will be a few specific settings that you may need change prior to generating the experiment compared to the instructions for pre-configured platforms: - -``MACHINE="MACOS" or MACHINE="LINUX"`` - These are the two ``MACHINE`` settings for generic, non-Rocoto-based platforms; you should choose the one most appropriate for your machine. ``MACOS`` has its own setting due to some differences in how command-line utilities function on Darwin-based operating systems. - -``LAYOUT_X=2`` - -``LAYOUT_Y=2`` - These are the settings that control the MPI decomposition when running the weather model. There are default values, but for your machine it is recommended that you specify your own layout to achieve the correct number of MPI processes for your application. In total, your machine should be able to handle ``LAYOUT_X×LAYOUT_Y+WRTCMP_write_tasks_per_group`` tasks. ``WRTCMP_write_tasks_per_group`` is the number of MPI tasks that will be set aside for writing model output, and it is a setting dependent on the domain you have selected. You can find and edit the value of this variable in the file ``regional_workflow/ush/set_predef_grid_params.sh``. - -``RUN_CMD_UTILS="mpirun -np 4"`` - This is the run command for MPI-enabled pre-processing utilities. Depending on your machine and your MPI installation, you may need to use a different command for launching an MPI-enabled executable. - -``RUN_CMD_POST="mpirun -np 1"`` - This is the same as RUN_CMD_UTILS but for UPP. - -``RUN_CMD_FCST='mpirun -np ${PE_MEMBER01}'`` - This is the run command for the weather model. 
It is **strongly** recommended that you use the variable ``${PE_MEMBER01}`` here, which is calculated within the workflow generation script (based on the layout and write tasks described above) and is the number of MPI tasks that the weather model will expect to run with. Running the weather model with a different number of MPI tasks than the workflow has been set up for can lead to segmentation faults and other errors. It is also important to use single quotes here (or escape the “$” character) so that ``PE_MEMBER01`` is not referenced until runtime, since it is not defined at the beginning of the workflow generation script. - -``FIXgsm=${WORKDIR}/data/fix_am`` - The location of the ``fix_am`` static files. This and the following two static data sets will need to be downloaded to your machine, as described in :numref:`Section %s `. - -``TOPO_DIR=${WORKDIR}/data/fix_orog`` - Location of ``fix_orog`` static files - -``SFC_CLIMO_INPUT_DIR=${WORKDIR}/data/fix_sfc_climo`` - Location of ``climo_fields_netcdf`` static files - -Once you are happy with your settings in ``config.sh``, it is time to run the workflow and move to the experiment directory (that is printed at the end of the script’s execution): - -.. code-block:: console - - ./generate_FV3LAM_wflow.sh - export EXPTDIR="your experiment directory" - cd $EXPTDIR - -From here, you can run each individual task of the UFS SRW App using the provided run scripts: - -.. code-block:: console - - cp ${WORKDIR}/ufs-srweather-app/regional_workflow/ush/wrappers/*sh . - cp ${WORKDIR}/ufs-srweather-app/regional_workflow/ush/wrappers/README.md . - -The ``README.md`` file will contain instructions on the order that each script should be run in. An example of wallclock times for each task for an example run (2017 Macbook Pro, macOS Catalina, 25km CONUS domain, 48hr forecast) is listed in :numref:`Table %s `. - -.. _WallClockTimes: - -.. table:: Example wallclock times for each workflow task. 
- - - +--------------------+----------------------------+------------+-----------+ - | **UFS Component** | **Script Name** | **Num.** | **Wall** | - | | | **Cores** | **time** | - +====================+============================+============+===========+ - | UFS_UTILS | ./run_get_ics.sh | n/a | 3 s | - +--------------------+----------------------------+------------+-----------+ - | UFS_UTILS | ./run_get_lbcs.sh | n/a | 3 s | - +--------------------+----------------------------+------------+-----------+ - | UFS_UTILS | ./run_make_grid.sh | n/a | 9 s | - +--------------------+----------------------------+------------+-----------+ - | UFS_UTILS | ./run_make_orog.sh | 4 | 1 m | - +--------------------+----------------------------+------------+-----------+ - | UFS_UTILS | ./run_make_sfc_climo.sh | 4 | 27 m | - +--------------------+----------------------------+------------+-----------+ - | UFS_UTILS | ./run_make_ics.sh | 4 | 5 m | - +--------------------+----------------------------+------------+-----------+ - | UFS_UTILS | ./run_make_lbcs.sh | 4 | 5 m | - +--------------------+----------------------------+------------+-----------+ - | ufs-weather-model | ./run_fcst.sh | 6 | 1h 40 m | - +--------------------+----------------------------+------------+-----------+ - | UPP | ./run_post.sh | 1 | 7 m | - +--------------------+----------------------------+------------+-----------+ - -Running on a New Platform with Rocoto Workflow Manager -====================================================== -All official HPC platforms for the UFS SRW App release make use of the Rocoto workflow management software for running experiments. If you would like to use the Rocoto workflow manager on a new machine, you will have to make modifications to the scripts in the ``regional_workflow`` repository. The easiest way to do this is to search the files in the ``regional_workflow/scripts`` and ``regional_workflow/ush`` directories for an existing platform name (e.g. ``CHEYENNE``) and add a stanza for your own unique machine (e.g. ``MYMACHINE``). As an example, here is a segment of code from ``regional_workflow/ush/setup.sh``, where the highlighted text is an example of the kind of change you will need to make: - -.. code-block:: console - :emphasize-lines: 11-18 - - ... - "CHEYENNE") - WORKFLOW_MANAGER="rocoto" - NCORES_PER_NODE=36 - SCHED="${SCHED:-pbspro}" - QUEUE_DEFAULT=${QUEUE_DEFAULT:-"regular"} - QUEUE_HPSS=${QUEUE_HPSS:-"regular"} - QUEUE_FCST=${QUEUE_FCST:-"regular"} - ;; - - "MYMACHINE") - WORKFLOW_MANAGER="rocoto" - NCORES_PER_NODE=your_machine_cores_per_node - SCHED="${SCHED:-your_machine_scheduler}" - QUEUE_DEFAULT=${QUEUE_DEFAULT:-"your_machine_queue_name"} - QUEUE_HPSS=${QUEUE_HPSS:-"your_machine_queue_name"} - QUEUE_FCST=${QUEUE_FCST:-"your_machine_queue_name"} - ;; - - "STAMPEDE") - WORKFLOW_MANAGER="rocoto" - ... - -You will also need to add ``MYMACHINE`` to the list of valid machine names in ``regional_workflow/ush/valid_param_vals.sh``. 
The minimum list of files that will need to be modified in this way are as follows (all in the ``regional_workflow`` repository): - -* ``scripts/exregional_run_post.sh``, line 131 -* ``scripts/exregional_make_sfc_climo.sh``, line 162 -* ``scripts/exregional_make_lbcs.sh``, line 114 -* ``scripts/exregional_make_orog.sh``, line 147 -* ``scripts/exregional_make_grid.sh``, line 145 -* ``scripts/exregional_run_fcst.sh``, line 140 -* ``scripts/exregional_make_ics.sh``, line 114 -* ``ush/setup.sh``, lines 431 and 742 -* ``ush/launch_FV3LAM_wflow.sh``, line 104 -* ``ush/get_extrn_mdl_file_dir_info.sh``, many lines, starting around line 589 -* ``ush/valid_param_vals.sh``, line 3 -* ``ush/load_modules_run_task.sh``, line 126 -* ``ush/set_extrn_mdl_params.sh``, many lines, starting around line 61 - -The line numbers may differ slightly given future bug fixes. Additionally, you may need to make further changes depending on the exact setup of your machine and Rocoto installation. Information about installing and configuring Rocoto on your machine can be found in the Rocoto GitHub repository: https://github.com/christopherwharrop/rocoto - -.. _SW-OS-Requirements: - -Software/Operating System Requirements -====================================== -Those requirements highlighted in **bold** are included in the NCEPLIBS-external (https://github.com/NOAA-EMC/NCEPLIBS-external) package. - -**Minimum platform requirements for the UFS SRW Application and NCEPLIBS:** - -* POSIX-compliant UNIX-style operating system - -* >40 GB disk space - - * 18 GB input data from GFS, RAP, and HRRR for Graduate Student Test - * 6 GB for NCEPLIBS-external and NCEPLIBS full installation - * 1 GB for ufs-srweather-app installation - * 11 GB for 48hr forecast on CONUS 25km domain - -* 4GB memory (CONUS 25km domain) - -* Fortran compiler with full Fortran 2008 standard support - -* C and C++ compiler - -* Python v3.6+, including prerequisite packages ``jinja2``, ``pyyaml`` and ``f90nml`` - -* Perl 5 - -* git v1.8+ - -* MPI (**MPICH**, OpenMPI, or other implementation) - -* CMake v3.12+ - -* Software libraries - - * **netCDF (C and Fortran libraries)** - * **HDF5** - * **ESMF** 8.2.0 - * **Jasper** - * **libJPG** - * **libPNG** - * **zlib** - -.. - COMMENT: Update version of ESMF? Need other version updates? - -macOS-specific prerequisites: - -* brew install wget -* brew install cmake -* brew install coreutils -* brew install pkg-config -* brew install gnu-sed - -Optional but recommended prerequisites: - -* Conda for installing/managing Python packages -* Bash v4+ -* Rocoto Workflow Management System (1.3.1) -* **CMake v3.15+** -* Python packages scipy, matplotlib, pygrib, cartopy, and pillow for graphics diff --git a/docs/UsersGuide/source/ConfigWorkflow.rst b/docs/UsersGuide/source/ConfigWorkflow.rst index 306562d317..9fcfbd8b11 100644 --- a/docs/UsersGuide/source/ConfigWorkflow.rst +++ b/docs/UsersGuide/source/ConfigWorkflow.rst @@ -5,9 +5,6 @@ Workflow Parameters: Configuring the Workflow in ``config.sh`` and ``config_defa ============================================================================================ To create the experiment directory and workflow when running the SRW App, the user must create an experiment configuration file named ``config.sh``. This file contains experiment-specific information, such as dates, external model data, observation data, directories, and other relevant settings. 
To help the user, two sample configuration files have been included in the ``regional_workflow`` repository’s ``ush`` directory: ``config.community.sh`` and ``config.nco.sh``. The first is for running experiments in community mode (``RUN_ENVIR`` set to "community"; see below), and the second is for running experiments in "nco" mode (``RUN_ENVIR`` set to "nco"). Note that for this release, only "community" mode is supported. These files can be used as the starting point from which to generate a variety of experiment configurations in which to run the SRW App. -.. - COMMENT: Is community mode still the only one supported? - There is an extensive list of experiment parameters that a user can set when configuring the experiment. Not all of these need to be explicitly set by the user in ``config.sh``. If a user does not define an entry in the ``config.sh`` script, either its value in ``config_defaults.sh`` will be used, or it will be reset depending on other parameters, such as the platform on which the experiment will be run (specified by ``MACHINE``). Note that ``config_defaults.sh`` contains the full list of experiment parameters that a user may set in ``config.sh`` (i.e., the user cannot set parameters in config.sh that are not initialized in ``config_defaults.sh``). The following is a list of the parameters in the ``config_defaults.sh`` file. For each parameter, the default value and a brief description is given. In addition, any relevant information on features and settings supported or unsupported in this release is specified. @@ -27,9 +24,6 @@ Platform Environment ``MACHINE``: (Default: "BIG_COMPUTER") The machine (a.k.a. platform) on which the workflow will run. Currently supported platforms include "WCOSS_DELL_P3", "HERA", "ORION", "JET", "ODIN", "CHEYENNE", "STAMPEDE", "GAEA", "SINGULARITY", "NOAACLOUD", "MACOS", and "LINUX". When running the SRW App in a container, set ``MACHINE`` to "SINGULARITY" regardless of the underlying platform. -.. - COMMENT: Are we deleting WCOSS_CRAY and/or GAEA? They're not listed in valid_param_vals.sh. What is the difference between SINGULARITY & NOAACLOUD? Can we use just one? Any other machines to add? - ``MACHINE_FILE``: (Default: "") Path to a configuration file with machine-specific settings. If none is provided, ``setup.sh`` will attempt to set the path to a configuration file for a supported platform. @@ -93,10 +87,10 @@ Parameters for Running Without a Workflow Manager These settings control run commands for platforms without a workflow manager. Values will be ignored unless ``WORKFLOW_MANAGER="none"``. ``RUN_CMD_UTILS``: (Default: "mpirun -np 1") - The run command for pre-processing utilities (shave, orog, sfc_climo_gen, etc.). This can be left blank for smaller domains, in which case the executables will run without :term:`MPI`. + The run command for MPI-enabled pre-processing utilities (e.g., shave, orog, sfc_climo_gen). This can be left blank for smaller domains, in which case the executables will run without :term:`MPI`. Users may need to use a different command for launching an MPI-enabled executable depending on their machine and MPI installation. ``RUN_CMD_FCST``: (Default: "mpirun -np \${PE_MEMBER01}") - The run command for the model forecast step. This will be appended to the end of the variable definitions file (``var_defns.sh``). + The run command for the model forecast step. This will be appended to the end of the variable definitions file (``var_defns.sh``). 
Changing the ``${PE_MEMBER01}`` variable is **not** recommended; it refers to the number of MPI tasks that the Weather Model will expect to run with. Running the Weather Model with a different number of MPI tasks than the workflow has been set up for can lead to segmentation faults and other errors. It is also important to escape the ``$`` character or use single quotes here so that ``PE_MEMBER01`` is not referenced until runtime, since it is not defined at the beginning of the workflow generation script. ``RUN_CMD_POST``: (Default: "mpirun -np 1") The run command for post-processing (:term:`UPP`). Can be left blank for smaller domains, in which case UPP will run without :term:`MPI`. @@ -320,8 +314,8 @@ METplus Parameters ``MRMS_OBS_DIR``: (Default: "") User-specified location of top-level directory where MRMS composite reflectivity files used by METplus are located. This parameter needs to be set for both user-provided observations and for observations that are retrieved from the NOAA HPSS (if the user has access) via the ``get_obs_mrms_tn`` task (activated in the workflow by setting ``RUN_TASK_GET_OBS_MRMS="TRUE"``). When pulling observations directly from NOAA HPSS, the data retrieved will be placed in this directory. Please note, this path must be defind as ``//mrms/proc``. METplus configuration files require the use of a predetermined directory structure and file names. Therefore, if the MRMS files are user-provided, they need to follow the anticipated naming structure: ``{YYYYMMDD}/MergedReflectivityQCComposite_00.50_{YYYYMMDD}-{HH}{mm}{SS}.grib2``, where YYYYMMDD and {HH}{mm}{SS} are as described in the note :ref:`above `. - .. note:: - METplus is configured to look for a MRMS composite reflectivity file for the valid time of the forecast being verified; since MRMS composite reflectivity files do not always exactly match the valid time, a script, within the main script to retrieve MRMS data from the NOAA HPSS, is used to identify and rename the MRMS composite reflectivity file to match the valid time of the forecast. The script to pull the MRMS data from the NOAA HPSS has an example of the expected file naming structure: ``regional_workflow/scripts/exregional_get_mrms_files.sh``. This script calls the script used to identify the MRMS file closest to the valid time: ``regional_workflow/ush/mrms_pull_topofhour.py``. +.. note:: + METplus is configured to look for a MRMS composite reflectivity file for the valid time of the forecast being verified; since MRMS composite reflectivity files do not always exactly match the valid time, a script, within the main script to retrieve MRMS data from the NOAA HPSS, is used to identify and rename the MRMS composite reflectivity file to match the valid time of the forecast. The script to pull the MRMS data from the NOAA HPSS has an example of the expected file naming structure: ``regional_workflow/scripts/exregional_get_mrms_files.sh``. This script calls the script used to identify the MRMS file closest to the valid time: ``regional_workflow/ush/mrms_pull_topofhour.py``. ``NDAS_OBS_DIR``: (Default: "") @@ -415,8 +409,6 @@ CCPP Parameter | "FV3_GFS_v15_thompson_mynn_lam3km" | "FV3_RRFS_v1alpha" -.. - COMMENT: "FV3_WoFS" technically has not been merged yet... and is called NSSL? What should I put for now? Current Default is "FV3_GFS_v15p2" - need to make sure we change that. 
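For reference, the suite is chosen by assigning one of the values listed above to ``CCPP_PHYS_SUITE`` in the user's ``config.sh``. A minimal, illustrative sketch (any other supported suite name may be substituted):

.. code-block:: console

   # In config.sh: select the CCPP physics suite for the experiment
   CCPP_PHYS_SUITE="FV3_RRFS_v1alpha"
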
Stochastic Physics Parameters ================================ @@ -510,10 +502,7 @@ Stochastic Kinetic Energy Backscatter (SKEB) Parameters * 2-pattern is vorticity ``SKEB_VDOF``: (Default: "10") - The number of degrees of freedom in the vertical for the SKEB random pattern. - -.. - COMMENT: The vertical what? + The number of degrees of freedom in the vertical direction for the SKEB random pattern. .. _SPP: @@ -561,9 +550,6 @@ Set default Stochastically Perturbed Parameterizations (SPP) stochastic physics ``SPP_VAR_LIST``: (Default: ( "pbl" "sfc" "mp" "rad" "gwd" ) ) The list of parameterizations to perturb: planetary boundary layer (PBL), surface physics (SFC), microphysics (MP), radiation (RAD), gravity wave drag (GWD). Valid values: "pbl", "sfc", "rad", "gwd", and "mp". -.. - COMMENT: Needs review. Is "rad" radiation? Need confiromation. - Land Surface Model (LSM) SPP ------------------------------- @@ -728,22 +714,20 @@ Computational Forecast Parameters #. If the experiment is using a predefined grid and the user sets the ``BLOCKSIZE`` parameter in the user-specified experiment configuration file (i.e., ``config.sh``), that value will be used in the forecast(s). Otherwise, the default ``BLOCKSIZE`` for that predefined grid will be used. #. If the experiment is *not* using a predefined grid (i.e., it is using a custom grid whose parameters are specified in the experiment configuration file), then the user must specify a value for the ``BLOCKSIZE`` parameter in that configuration file. Otherwise, it will remain set to a null string, and the experiment generation will fail, because the generation scripts check to ensure that all the parameters defined in this section are set to non-empty strings before creating the experiment directory. +.. _WriteComp: Write-Component (Quilting) Parameters ====================================== .. note:: - The :term:`UPP` (called by the ``RUN_POST_TN`` task) cannot process output on the native grid types ("GFDLgrid" and "ESGgrid"), so output fields are interpolated to a write-component grid before writing them to an output file. The output files written by the UFS Weather Model model use an Earth System Modeling Framework (ESMF) component, referred to as the write component. This model component is configured with settings in the ``model_configure`` file, as described in `Section 4.2.3 `__ of the UFS Weather Model documentation. + The :term:`UPP` (called by the ``RUN_POST_TN`` task) cannot process output on the native grid types ("GFDLgrid" and "ESGgrid"), so output fields are interpolated to a **write-component grid** before writing them to an output file. The output files written by the UFS Weather Model model use an Earth System Modeling Framework (ESMF) component, referred to as the **write component**. This model component is configured with settings in the ``model_configure`` file, as described in `Section 4.2.3 `__ of the UFS Weather Model documentation. ``QUILTING``: (Default: "TRUE") - .. attention:: - The regional grid requires the use of the write component, so users generally should not need to change the default value for ``QUILTING``. - - Flag that determines whether to use the write component for writing forecast output files to disk. If set to "TRUE", the forecast model will output files named ``dynf$HHH.nc`` and ``phyf$HHH.nc`` (where HHH is the 3-hour output forecast hour) containing dynamics and physics fields, respectively, on the write-component grid. 
(The regridding from the native FV3-LAM grid to the write-component grid is done by the forecast model.) If ``QUILTING`` is set to "FALSE", then the output file names are ``fv3_history.nc`` and ``fv3_history2d.nc``, and they contain fields on the native grid. Although the UFS Weather Model can run without quilting, the regional grid requires the use of the write component. Therefore, QUILTING should be set to "TRUE" when running the SRW App. If ``QUILTING`` is set to "FALSE", the ``RUN_POST_TN`` (meta)task cannot run because the :term:`UPP` code that this task calls cannot process fields on the native grid. In that case, the ``RUN_POST_TN`` (meta)task will be automatically removed from the Rocoto workflow XML. The :ref:`INLINE POST ` option also requires ``QUILTING`` to be set to "TRUE" in the SRW App. +.. attention:: + The regional grid requires the use of the write component, so users generally should not need to change the default value for ``QUILTING``. -.. - COMMENT: Still don't undertand what HHH refers to... can we give an example? + Flag that determines whether to use the write component for writing forecast output files to disk. If set to "TRUE", the forecast model will output files named ``dynf$HHH.nc`` and ``phyf$HHH.nc`` (where HHH is the 3-digit forecast hour) containing dynamics and physics fields, respectively, on the write-component grid. For example, the output files for the 3rd hour of the forecast would be ``dynf$003.nc`` and ``phyf$003.nc``. (The regridding from the native FV3-LAM grid to the write-component grid is done by the forecast model.) If ``QUILTING`` is set to "FALSE", then the output file names are ``fv3_history.nc`` and ``fv3_history2d.nc``, and they contain fields on the native grid. Although the UFS Weather Model can run without quilting, the regional grid requires the use of the write component. Therefore, QUILTING should be set to "TRUE" when running the SRW App. If ``QUILTING`` is set to "FALSE", the ``RUN_POST_TN`` (meta)task cannot run because the :term:`UPP` code that this task calls cannot process fields on the native grid. In that case, the ``RUN_POST_TN`` (meta)task will be automatically removed from the Rocoto workflow XML. The :ref:`INLINE POST ` option also requires ``QUILTING`` to be set to "TRUE" in the SRW App. ``PRINT_ESMF``: (Default: "FALSE") Flag that determines whether to output extra (debugging) information from ESMF routines. Must be "TRUE" or "FALSE". Note that the write component uses ESMF library routines to interpolate from the native forecast model grid to the user-specified output grid (which is defined in the model configuration file ``model_configure`` in the forecast run directory). @@ -755,10 +739,7 @@ Write-Component (Quilting) Parameters The number of MPI tasks to allocate for each write group. ``WRTCMP_output_grid``: (Default: "''") - Sets the type (coordinate system) of the write component grid. The default empty string forces the user to set a valid value for ``WRTCMP_output_grid`` in ``config.sh`` if specifying a *custom* grid. Otherwise, the ordinary "regional_latlon" grid will be used. Valid values: "lambert_conformal" "regional_latlon" "rotated_latlon" - -.. - COMMENT: If no value is specified in config.sh, would setup.sh (or some other script?) use the ordinary "regional_latlon"? Or would the experiment just fail? + Sets the type (coordinate system) of the write component grid. The default empty string forces the user to set a valid value for ``WRTCMP_output_grid`` in ``config.sh`` if specifying a *custom* grid. 
When creating an experiment with a user-defined grid, this parameter must be specified or the experiment will fail. Valid values: "lambert_conformal" "regional_latlon" "rotated_latlon" ``WRTCMP_cen_lon``: (Default: "") Longitude (in degrees) of the center of the write component grid. Can usually be set to the corresponding value from the native grid. @@ -767,13 +748,10 @@ Write-Component (Quilting) Parameters Latitude (in degrees) of the center of the write component grid. Can usually be set to the corresponding value from the native grid. ``WRTCMP_lon_lwr_left``: (Default: "") - Longitude (in degrees) of the center of the lower-left (southwest) cell on the write component grid. If using the "rotated_latlon" coordinate system, this is expressed in terms of the rotated longitude. Must be set manually. - -.. - COMMENT: Has this changed? Or still manual? + Longitude (in degrees) of the center of the lower-left (southwest) cell on the write component grid. If using the "rotated_latlon" coordinate system, this is expressed in terms of the rotated longitude. Must be set manually when running an experiment with a user-defined grid. ``WRTCMP_lat_lwr_left``: (Default: "") - Latitude (in degrees) of the center of the lower-left (southwest) cell on the write component grid. If using the "rotated_latlon" coordinate system, this is expressed in terms of the rotated latitude. Must be set manually. + Latitude (in degrees) of the center of the lower-left (southwest) cell on the write component grid. If using the "rotated_latlon" coordinate system, this is expressed in terms of the rotated latitude. Must be set manually when running an experiment with a user-defined grid. **The following parameters must be set when** ``WRTCMP_output_grid`` **is set to "rotated_latlon":** @@ -819,15 +797,13 @@ Predefined Grid Parameters | "RRFS_CONUS_25km" | "RRFS_CONUS_13km" - | "RRFS_CONUS_3km" + | "RRFS_CONUS_3km" + | "SUBCONUS_Ind_3km" **Other valid values include:** - | "RRFS_SUBCONUS_3km" - | "RRFS_AK_13km" - | "RRFS_AK_3km" | "CONUS_25km_GFDLgrid" - | "CONUS_3km_GFDLgrid" + | "CONUS_3km_GFDLgrid" | "EMC_AK" | "EMC_HI" | "EMC_PR" @@ -835,12 +811,16 @@ Predefined Grid Parameters | "GSL_HAFSV0.A_25km" | "GSL_HAFSV0.A_13km" | "GSL_HAFSV0.A_3km" - | "GSD_HRRR_AK_50km" + | "GSD_HRRR_AK_50km" + | "RRFS_AK_13km" + | "RRFS_AK_3km" + | "RRFS_CONUScompact_25km" + | "RRFS_CONUScompact_13km" + | "RRFS_CONUScompact_3km" | "RRFS_NA_13km" | "RRFS_NA_3km" - -.. - COMMENT: Are all of these now being supported or still just the three main ones? Am I missing any? + | "RRFS_SUBCONUS_3km" + | "WoFS_3km" .. note:: @@ -871,6 +851,8 @@ Debug Parameter ``DEBUG``: (Default: "FALSE") Flag that determines whether to print out very detailed debugging messages. Note that if DEBUG is set to TRUE, then VERBOSE will also get reset to TRUE if it isn't already. Valid values: "TRUE" "true" "YES" "yes" "FALSE" "false" "NO" "no" +.. _WFTasks: + Rocoto Workflow Tasks ======================== @@ -1104,10 +1086,6 @@ Verification Tasks ``RUN_TASK_GET_OBS_NDAS``: (Default: "FALSE") Flag that determines whether to run the ``GET_OBS_NDAS_TN`` task, which retrieves the :term:`NDAS` PrepBufr files used by METplus from NOAA HPSS. -.. - COMMENT: Need confirmation that the above 3 task explanations are correct. - - ``RUN_TASK_VX_GRIDSTAT``: (Default: "FALSE") Flag that determines whether to run the grid-stat verification task. 
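To illustrate how these task flags fit together, a hedged ``config.sh`` excerpt that retrieves MRMS and NDAS observations and runs the grid-stat verification task might look like the following. The flag names are taken from the descriptions above; the corresponding observation directories (e.g., ``MRMS_OBS_DIR`` and ``NDAS_OBS_DIR``) must also be set as described in the METplus Parameters section.

.. code-block:: console

   # In config.sh: retrieve verification observations and enable grid-stat verification
   RUN_TASK_GET_OBS_MRMS="TRUE"
   RUN_TASK_GET_OBS_NDAS="TRUE"
   RUN_TASK_VX_GRIDSTAT="TRUE"
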
@@ -1283,10 +1261,7 @@ Subhourly Forecast Parameters Flag that indicates whether the forecast model will generate output files on a sub-hourly time interval (e.g., 10 minutes, 15 minutes). This will also cause the post-processor to process these sub-hourly files. If this variable is set to "TRUE", then ``DT_SUB_HOURLY_POST_MNTS`` should be set to a value between "01" and "59". ``DT_SUB_HOURLY_POST_MNTS``: (Default: "00") - Time interval in minutes between the forecast model output files. If ``SUB_HOURLY_POST`` is set to "TRUE", this needs to be set to a two-digit integer between "01" and "59". Note that if ``SUB_HOURLY_POST`` is set to "TRUE" but ``DT_SUB_HOURLY_POST_MNTS`` is set to "00", ``SUB_HOURLY_POST`` will get reset to "FALSE" in the experiment generation scripts (there will be an informational message in the log file to emphasize this). - -.. - COMMENT: In valid_param_vals.sh only these values are listed: "1" "01" "2" "02" "3" "03" "4" "04" "5" "05" "6" "06" "10" "12" "15" "20" "30". + Time interval in minutes between the forecast model output files. If ``SUB_HOURLY_POST`` is set to "TRUE", this needs to be set to a two-digit integer between "01" and "59". Note that if ``SUB_HOURLY_POST`` is set to "TRUE" but ``DT_SUB_HOURLY_POST_MNTS`` is set to "00", ``SUB_HOURLY_POST`` will get reset to "FALSE" in the experiment generation scripts (there will be an informational message in the log file to emphasize this). Valid values: "1" "01" "2" "02" "3" "03" "4" "04" "5" "05" "6" "06" "10" "12" "15" "20" "30". Customized Post Configuration Parameters ======================================== @@ -1301,7 +1276,7 @@ Customized Post Configuration Parameters Community Radiative Transfer Model (CRTM) Parameters ======================================================= -These variables set parameters associated with outputting satellite fields in the :term:`UPP` :term:`grib2` files using the Community Radiative Transfer Model (:term:`CRTM`). +These variables set parameters associated with outputting satellite fields in the :term:`UPP` :term:`grib2` files using the Community Radiative Transfer Model (:term:`CRTM`). :numref:`Section %s ` includes further instructions on how to do this. .. COMMENT: What actually happens here? Where are the satellite fields outputted to? When/why would this be used? What kind of satellites? diff --git a/docs/UsersGuide/source/Glossary.rst b/docs/UsersGuide/source/Glossary.rst index ba07c66f26..34154fda30 100644 --- a/docs/UsersGuide/source/Glossary.rst +++ b/docs/UsersGuide/source/Glossary.rst @@ -6,6 +6,15 @@ Glossary .. glossary:: + advect + To transport substances in the atmosphere by :term:`advection`. + + advection + According to the American Meteorological Society (AMS) `definition `__, advection is "The process of transport of an atmospheric property solely by the mass motion (velocity field) of the atmosphere." In common parlance, advection is movement of atmospheric substances that are carried around by the wind. + + CAPE + Convective Available Potential Energy. + CCPA Climatology-Calibrated Precipitation Analysis (CCPA) data. This data is required for METplus precipitation verification tasks within the SRW App. The most recent 8 days' worth of data are publicly available and can be accessed `here `__. @@ -16,6 +25,9 @@ Glossary The preprocessing software used to create initial and boundary condition files to “coldstart” the forecast model. + CIN + Convective Inhibition. + CRTM `Community Radiative Transfer Model `__.
CRTM is a fast and accurate radiative transfer model developed at the `Joint Center for Satellite Data Assimilation `__ (JCSDA) in the United States. It is a sensor-based radiative transfer model and supports more than 100 sensors, including sensors on most meteorological satellites and some from other remote sensing satellites. @@ -31,6 +43,10 @@ Glossary CONUS Continental United States + convection-allowing models + CAM + Convection-allowing models (CAMs) are models that run on high-resolution grids (usually with grid spacing at 4km or less) that are able to resolve the effects of small-scale convective processes. They typically run several times a day to provide frequent forecasts (e.g., hourly or even subhourly). + cycle An hour of the day on which a forecast is started. @@ -49,9 +65,15 @@ Glossary echo top The radar-indicated top of an area of precipitation. Specifically, it contains the height of the 18 dBZ reflectivity value. + EMC + The `Environmental Modeling Center `__. + EPIC EPIC stands for the `Earth Prediction Innovation Center `__. EPIC seeks to accelerate scientific research and modeling contributions through continuous and sustained community engagement to produce the most accurate and reliable operational modeling system in the world. + ESG + Extended Schmidt Gnomonic (ESG) grid. The ESG grid uses the map projection developed by Jim Purser of NOAA :term:`EMC` (:cite:t:`Purser_2020`). + ESMF `Earth System Modeling Framework `__. The ESMF defines itself as “a suite of software tools for developing high-performance, multi-component Earth science modeling applications.” @@ -76,18 +98,24 @@ Glossary HPC-Stack The `HPC-Stack `__ is a repository that provides a unified, shell script-based build system for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `__ and the `Joint Effort for Data assimilation Integration (JEDI) `__ framework. + HPSS + NOAA HPSS + National Oceanic and Atmospheric Administration (NOAA) High Performance Storage System (HPSS). + HRRR `High Resolution Rapid Refresh `__. The HRRR is a NOAA real-time 3-km resolution, hourly updated, cloud-resolving, convection-allowing atmospheric model, initialized by 3km grids with 3km radar assimilation. Radar data is assimilated in the HRRR every 15 min over a 1-h period adding further detail to that provided by the hourly data assimilation from the 13km radar-enhanced Rapid Refresh. IC/LBC Initial conditions/lateral boundary conditions + IC ICs Initial conditions LAM Limited Area Model, formerly known as the "Stand-Alone Regional Model," or SAR. LAM grids use a regional (rather than global) configuration of the FV3 dynamical core. + LBC LBCs Lateral boundary conditions @@ -132,6 +160,9 @@ Glossary NEMSIO A binary format for atmospheric model output from :term:`NCEP`'s Global Forecast System (GFS). + netCDF + NetCDF (Network Common Data Form) is a file format and community standard for storing multidimensional scientific data. It includes a set of software libraries and machine-independent data formats that support the creation, access, and sharing of array-oriented scientific data (see https://www.unidata.ucar.edu/software/netcdf/). + NUOPC The `National Unified Operational Prediction Capability `__ Layer "defines conventions and a set of generic components for building coupled models using the Earth System Modeling Framework (:term:`ESMF`)." @@ -142,6 +173,7 @@ Glossary The branch of physical geography dealing with mountains. 
Parameterization + Parameterizations Simplified functions that approximate the effects of small-scale processes (e.g., microphysics, gravity wave drag) that cannot be explicitly resolved by a model grid’s representation of the earth. RAP @@ -153,6 +185,10 @@ Glossary SDF Suite Definition File. An external file containing information about the construction of a physics suite. It describes the schemes that are called, in which order they are called, whether they are subcycled, and whether they are assembled into groups to be called together. + tracer + tracers + According to the American Meteorological Society (AMS) `definition `__, a tracer is "Any substance in the atmosphere that can be used to track the history [i.e., movement] of an air mass." Tracers are carried around by the motion of the atmosphere (i.e., by :term:`advection`). These substances are usually gases (e.g., water vapor, CO2), but they can also be non-gaseous (e.g., rain drops in microphysics parameterizations). In weather models, temperature (or potential temperature), absolute humidity, and radioactivity are also usually treated as tracers. According to AMS, "The main requirement for a tracer is that its lifetime be substantially longer than the transport process under study." + UFS The Unified Forecast System is a community-based, coupled comprehensive Earth modeling system consisting of several applications (apps). These apps span regional to global diff --git a/docs/UsersGuide/source/Graphics.rst b/docs/UsersGuide/source/Graphics.rst index 51f03f8a6d..6243696ce7 100644 --- a/docs/UsersGuide/source/Graphics.rst +++ b/docs/UsersGuide/source/Graphics.rst @@ -3,8 +3,8 @@ =================== Graphics Generation =================== -Two Python plotting scripts are provided to generate plots from the FV3-LAM post-processed GRIB2 -output over the CONUS for a number of variables, including: +Two Python plotting scripts are provided to generate plots from the FV3-LAM post-processed :term:`GRIB2` +output over the :term:`CONUS` for a number of variables, including: * 2-m temperature * 2-m dew point temperature @@ -13,24 +13,28 @@ output over the CONUS for a number of variables, including: * 250 hPa winds * Accumulated precipitation * Composite reflectivity -* Surface-based CAPE/CIN +* Surface-based :term:`CAPE`/:term:`CIN` * Max/Min 2-5 km updraft helicity * Sea level pressure (SLP) The Python scripts are located under ``ufs-srweather-app/regional_workflow/ush/Python``. The script ``plot_allvars.py`` plots the output from a single cycle within an experiment, while the script ``plot_allvars_diff.py`` plots the difference between the same cycle from two different -experiments (e.g. the experiments may differ in some aspect such as the physics suite used). If +experiments (e.g., the experiments may differ in some aspect such as the physics suite used). If plotting the difference, the two experiments must be on the same domain and available for the same cycle starting date/time and forecast hours. The Python scripts require a cycle starting date/time in YYYYMMDDHH format, a starting forecast -hour, an ending forecast hour, a forecast hour increment, the paths to one or two experiment directories, +hour, an ending forecast hour, a forecast hour increment, paths to one or two experiment directories, and a path to the directory where the Cartopy Natural Earth shape files are located. -The full set of Cartopy shape files can be downloaded at https://www.naturalearthdata.com/downloads/. 
+The full set of Cartopy shape files can be downloaded `here `. For convenience, the small subset of files required for these Python scripts can be obtained from the `EMC ftp data repository `_ or from `AWS cloud storage `_. + +.. + COMMENT: Update these links!!! + In addition, the Cartopy shape files are available on a number of Level 1 platforms in the following locations: @@ -64,9 +68,16 @@ On Gaea: /lustre/f2/pdata/esrl/gsd/ufs/NaturalEarth +On NOAA Cloud: + +.. code-block:: console + + /contrib/EPIC/NaturalEarth + + The medium scale (1:50m) cultural and physical shapefiles are used to create coastlines and other geopolitical borders on the map. Cartopy provides the ‘background_img()’ method to add background -images in a convenient way. The default scale (resolution) of background attributes in the Python +images in a convenient way. The default scale (resolution) of background attributes in the Python scripts is 1:50m Natural Earth I with Shaded Relief and Water, which should be sufficient for most regional applications. @@ -104,13 +115,21 @@ On Gaea: module use /lustre/f2/pdata/esrl/gsd/contrib/modulefiles module load miniconda3/4.8.3-regional-workflow +On NOAA Cloud: + +.. code-block:: console + + module use /contrib/GST/miniconda3/modulefiles + module load miniconda3/4.10.3 + conda activate regional_workflow + .. note:: If using one of the batch submission scripts described below, the user does not need to manually load an environment because the scripts perform this task. -Plotting output from one experiment -=================================== +Plotting Output from One Experiment +====================================== Before generating plots, it is convenient to change location to the directory containing the plotting scripts: @@ -124,24 +143,27 @@ following six command line arguments: #. Cycle date/time (``CDATE``) in YYYYMMDDHH format #. Starting forecast hour -#. Ending forecast hour +#. Ending forecast hour #. Forecast hour increment -#. The top level of the experiment directory ``EXPTDIR`` containing the post-processed data. The script will look for the data files in the directory ``EXPTDIR/CDATE/postprd``. -#. The base directory ``CARTOPY_DIR`` of the cartopy shapefiles. The script will look for the shape files (``*.shp``) in the directory ``CARTOPY_DIR/shapefiles/natural_earth/cultural``. +#. The top level of the experiment directory ``EXPTDIR`` containing the post-processed data. The script will look for the data files in the directory ``EXPTDIR/CDATE/postprd``. +#. The base directory ``CARTOPY_DIR`` of the cartopy shapefiles. The script will look for the shape files (``*.shp``) in the directory ``CARTOPY_DIR/shapefiles/natural_earth/cultural``. + +.. note:: + If a forecast starts at 18h, this is considered the 0th forecast hour, so "starting forecast hour" should be 0, not 18. An example of plotting output from a cycle generated using the sample experiment/workflow -configuration in the ``config.community.sh`` script (which uses the GFSv15p2 suite definition file) +configuration in the ``config.community.sh`` script (which uses the GFSv16 suite definition file) is as follows: .. 
code-block:: console - python plot_allvars.py 2019061500 6 48 6 /path-to/expt_dirs/test_CONUS_25km_GFSv15p2 /path-to/NaturalEarth + python plot_allvars.py 2019061500 6 48 6 /path-to/expt_dirs/test_CONUS_25km_GFSv16 /path-to/NaturalEarth -The output files (in .png format) will be located in the directory ``EXPTDIR/CDATE/postprd``, -where in this case ``EXPTDIR`` is ``/path-to/expt_dirs/test_CONUS_25km_GFSv15p2`` and ``CDATE`` +The output files (in ``.png`` format) will be located in the directory ``EXPTDIR/CDATE/postprd``, +where in this case ``EXPTDIR`` is ``/path-to/expt_dirs/test_CONUS_25km_GFSv16`` and ``CDATE`` is ``2019061500``. -Plotting differences from two experiments +Plotting Differences from Two Experiments ========================================= To generate difference plots, the ``plot_allvars_diff.py`` script must be called with the following @@ -151,66 +173,50 @@ seven command line arguments: #. Starting forecast hour #. Ending forecast hour #. Forecast hour increment -#. The top level of the first experiment directory ``EXPTDIR1`` containing the first set of post-processed data. The script will look for the data files in the directory ``EXPTDIR1/CDATE/postprd``. -#. The top level of the first experiment directory ``EXPTDIR2`` containing the second set of post-processed data. The script will look for the data files in the directory ``EXPTDIR2/CDATE/postprd``. -#. The base directory ``CARTOPY_DIR`` of the cartopy shapefiles. The script will look for the shape files (``*.shp``) in the directory ``CARTOPY_DIR/shapefiles/natural_earth/cultural``. +#. The top level of the first experiment directory ``EXPTDIR1`` containing the first set of post-processed data. The script will look for the data files in the directory ``EXPTDIR1/CDATE/postprd``. +#. The top level of the second experiment directory ``EXPTDIR2`` containing the second set of post-processed data. The script will look for the data files in the directory ``EXPTDIR2/CDATE/postprd``. +#. The base directory ``CARTOPY_DIR`` of the cartopy shapefiles. The script will look for the shape files (``*.shp``) in the directory ``CARTOPY_DIR/shapefiles/natural_earth/cultural``. -An example of plotting differences from two experiments for the same date and predefined domain where one uses -the "FV3_GFS_v15p2" suite definition file (SDF) and one using the "FV3_RRFS_v1alpha" SDF is as follows: +An example of plotting differences from two experiments for the same date and predefined domain where one uses the "FV3_GFS_v16" suite definition file (SDF) and the other uses the "FV3_RRFS_v1beta" SDF is as follows: .. code-block:: console - python plot_allvars_diff.py 2019061518 6 18 3 /path-to/expt_dirs1/test_CONUS_3km_GFSv15p2 /path-to/expt_dirs2/test_CONUS_3km_RRFSv1alpha /path-to/NaturalEarth + python plot_allvars_diff.py 2019061518 0 18 6 /path-to/expt_dirs1/test_CONUS_3km_GFSv16 /path-to/expt_dirs2/test_CONUS_3km_RRFSv1beta /path-to/NaturalEarth -In this case, the output png files will be located in the directory ``EXPTDIR1/CDATE/postprd``. +In this case, the output ``.png`` files will be located in the directory ``EXPTDIR1/CDATE/postprd``. -Submitting plotting scripts through a batch system -================================================== +Submitting Plotting Scripts Through a Batch System +====================================================== -If the Python scripts are being used to create plots of multiple forecast lead times and forecast -variables, then you may need to submit them to the batch system. 
Example scripts are provided called -``sq_job.sh`` and ``sq_job_diff.sh`` for use on a platform such as Hera that uses the Slurm -job scheduler or ``qsub_job.sh`` and ``qsub_job_diff.sh`` for use on a platform such as -Cheyenne that uses PBS as the job scheduler. Examples of these scripts are located under -``ufs-srweather-app/regional_workflow/ush/Python`` and can be used as a starting point to create a batch script -for your platform/job scheduler of use. +If users plan to create plots of multiple forecast lead times and forecast variables, then they may need to submit the Python scripts to the batch system. Sample scripts are provided for use on a platform such as Hera that uses the Slurm job scheduler: ``sq_job.sh`` and ``sq_job_diff.sh``. Equivalent sample scripts are provided for use on a platform such as Cheyenne that uses PBS as the job scheduler: ``qsub_job.sh`` and ``qsub_job_diff.sh``. Examples of these scripts are located under ``ufs-srweather-app/regional_workflow/ush/Python`` and can be used as a starting point to create a batch script for a user's specific platform/job scheduler. At a minimum, the account should be set appropriately prior to job submission: .. code-block:: console - #SBATCH --account=an_account + #SBATCH --account= -Depending on the platform you are running on, you may also need to adjust the settings to use -the correct Python environment and path to the shape files. +Depending on the platform, users may also need to adjust the settings to use the correct Python environment and path to the shape files. -When using these batch scripts, several environment variables must be set prior to submission. +When working with these batch scripts, several environment variables must be set prior to submission. If plotting output from a single cycle, the variables to set are ``HOMErrfs`` and ``EXPTDIR``. -In this case, if the user's login shell is csh/tcsh, these variables can be set as follows: - -.. code-block:: console - - setenv HOMErrfs /path-to/ufs-srweather-app/regional_workflow - setenv EXPTDIR /path-to/experiment/directory - -If the user's login shell is bash, they can be set as follows: +If the user's login shell is bash, these variables can be set as follows: .. code-block:: console export HOMErrfs=/path-to/ufs-srweather-app/regional_workflow export EXPTDIR=/path-to/experiment/directory -If plotting the difference between the same cycle from two different experiments, the variables -to set are ``HOMErrfs``, ``EXPTDIR1``, and ``EXPTDIR2``. In this case, if the user's login shell -is csh/tcsh, these variables can be set as follows: +If the user's login shell is csh/tcsh, they can be set as follows: .. code-block:: console setenv HOMErrfs /path-to/ufs-srweather-app/regional_workflow - setenv EXPTDIR1 /path-to/experiment/directory1 - setenv EXPTDIR2 /path-to/experiment/directory2 + setenv EXPTDIR /path-to/experiment/directory -If the user's login shell is bash, they can be set as follows: +If plotting the difference between the same cycle from two different experiments, the variables +to set are ``HOMErrfs``, ``EXPTDIR1``, and ``EXPTDIR2``. If the user's login shell +is bash, these variables can be set as follows: .. code-block:: console @@ -218,21 +224,29 @@ If the user's login shell is bash, they can be set as follows: export EXPTDIR1=/path-to/experiment/directory1 export EXPTDIR2=/path-to/experiment/directory2 +If the user's login shell is csh/tcsh, they can be set as follows: + +.. 
code-block:: console + + setenv HOMErrfs /path-to/ufs-srweather-app/regional_workflow + setenv EXPTDIR1 /path-to/experiment/directory1 + setenv EXPTDIR2 /path-to/experiment/directory2 + In addition, the variables ``CDATE``, ``FCST_START``, ``FCST_END``, and ``FCST_INC`` in the batch -scripts can be modified depending on the user's needs. By default, ``CDATE`` is set as follows +scripts can be modified depending on the user's needs. By default, ``CDATE`` is set as follows in the batch scripts: .. code-block:: console export CDATE=${DATE_FIRST_CYCL}${CYCL_HRS} -This sets ``CDATE`` to the first cycle in the set of cycles that the experiment has run. If the +This sets ``CDATE`` to the first cycle in the set of cycles that the experiment has run. If the experiment contains multiple cycles and the user wants to plot output from a cycle other than the very first one, ``CDATE`` in the batch scripts will have to be set to the specific YYYYMMDDHH -value for that cycle. Also, to plot hourly forecast output, ``FCST_INC`` should be set to 1; to +value for that cycle. Also, to plot hourly forecast output, ``FCST_INC`` should be set to 1; to plot only a subset of the output hours, ``FCST_START``, ``FCST_END``, and ``FCST_INC`` must be -set accordingly, e.g. to generate plots for every 6th forecast hour starting with forecast hour 6 -and ending with the last forecast hour, use +set accordingly, e.g., to generate plots for every 6th forecast hour starting with forecast hour 6 +and ending with the last forecast hour, use: .. code-block:: console @@ -240,9 +254,7 @@ and ending with the last forecast hour, use export FCST_END=${FCST_LEN_HRS} export FCST_INC=6 -The scripts must be submitted using the command appropriate -for the job scheduler used on your platform. For example, on Hera, -``sq_job.sh`` can be submitted as follows: +The scripts must be submitted using the command appropriate for the job scheduler used on the user's platform. For example, on Hera, ``sq_job.sh`` can be submitted as follows: .. code-block:: console diff --git a/docs/UsersGuide/source/Include-HPCInstall.rst b/docs/UsersGuide/source/Include-HPCInstall.rst index b467d96d23..e9d5e2d482 100644 --- a/docs/UsersGuide/source/Include-HPCInstall.rst +++ b/docs/UsersGuide/source/Include-HPCInstall.rst @@ -1,6 +1,7 @@ .. _InstallHPCstack: .. include:: ../../../hpc-stack-mod/docs/source/hpc-install.rst +.. include:: ../../../hpc-stack-mod/docs/source/mac-install.rst .. include:: ../../../hpc-stack-mod/docs/source/hpc-prereqs.rst .. include:: ../../../hpc-stack-mod/docs/source/hpc-parameters.rst diff --git a/docs/UsersGuide/source/InputOutputFiles.rst b/docs/UsersGuide/source/InputOutputFiles.rst index cfda134b60..8e56abd42a 100644 --- a/docs/UsersGuide/source/InputOutputFiles.rst +++ b/docs/UsersGuide/source/InputOutputFiles.rst @@ -4,7 +4,7 @@ Input and Output Files ======================= This chapter provides an overview of the input and output files needed by the components -of the UFS SRW Application (i.e., :term:`UFS_UTILS`, the UFS :term:`Weather Model`, and the :term:`UPP`). Links to more detailed documentation for each of the components are provided. For SRW App users who want to jump straight to downloading and staging the files, see :numref:`Section %s `. +of the UFS SRW Application. Links to more detailed documentation for each of the components (e.g., UFS_UTILS, the UFS Weather Model, and the UPP) are provided in the sections below. 
For SRW App users who want to jump straight to downloading and staging the files, see :numref:`Section %s `. .. _Input: @@ -16,21 +16,23 @@ conditions files, and model configuration files (such as namelists). Initial and Boundary Condition Files ------------------------------------ -The external model files needed for initializing the runs can be obtained in a number of -ways, including: pulled directly from `NOMADS `_; -limited data availability), pulled from the NOAA HPSS during the workflow execution (requires -user access), or obtained and staged by the user from a different source. The data format for -these files can be :term:`GRIB2` or :term:`NEMSIO`. More information on downloading and setting up -the external model data can be found in :numref:`Section %s `. Once the data is set up, the end-to-end application will run the system and write output files to disk. +The external model files needed for initializing an experiment can be obtained in a number of +ways, including: + + * pulled directly from `NOMADS `_ (limited timespan for data availability), + * pulled from the NOAA High Performance Storage System (HPSS) during the workflow execution (requires user access), or + * obtained and staged by the user from a different source. + +The data format for these files can be :term:`GRIB2` or :term:`NEMSIO`. More information on downloading and setting up the external model data can be found in :numref:`Section %s `. Once the data is set up, the end-to-end application will run the system and write output files to disk. Pre-processing (UFS_UTILS) -------------------------- -When a user runs the SRW Application as described in the Quick Start Guide :numref:`Chapter %s `, :numref:`Step %s Generate the Forecast Experiment ` links the input data for the pre-processing utilities from a location on disk to the experiment directory. The pre-processing utilities use many different datasets to create grids and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found `here `__. +When a user generates the regional workflow, as described in :numref:`Step %s ` of the Quick Start Guide, the workflow generation script links the input data for the pre-processing utilities to the experiment directory. The pre-processing utilities use many different datasets to create grids and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found in the `UFS_UTILS Documentation `__. UFS Weather Model ----------------- -The input files for the weather model include both static (fixed) files and grid- and date-specific files (terrain, initial conditions, boundary conditions, etc). The static fix files -must be staged by the user unless you are running on a Level 1/pre-configured platform, in which case you can link to the existing copy of the data on that machine. See :numref:`Section %s ` for more information. The static, grid, and date-specific files are linked in the experiment directory by the workflow scripts. An extensive description of the input files for the weather model can be found in the `UFS Weather Model User's Guide `__. The namelists and configuration files for the SRW Application are created from templates by the workflow, as described in :numref:`Section %s `. 
+The input files for the Weather Model include both static (fixed) files and grid- and date-specific files (terrain, initial conditions, boundary conditions, etc). The static fix files +must be staged by the user unless the user is running on a `Level 1/pre-configured `__ platform, in which case users can link to the existing copy of the data on their machine. See :numref:`Section %s ` for instructions. The workflow scripts link the static, grid, and date-specific files in the experiment directory. An extensive description of the input files for the Weather Model can be found in the `UFS Weather Model User's Guide `__. The namelists and configuration files for the SRW Application are created from templates by the workflow generation script, as described in :numref:`Section %s `. Unified Post Processor (UPP) ---------------------------- @@ -42,8 +44,7 @@ Documentation for the UPP input files can be found in the `UPP User's Guide Workflow -------- The SRW Application uses a series of template files, combined with user-selected settings, -to create the required namelists and parameter files needed by the Application. These -templates can be reviewed to see what defaults are being used and where configuration parameters from the ``config.sh`` file are assigned. +to create the required namelists and parameter files needed by the Application workflow. (See :numref:`Figure %s ` for a visual summary of the workflow generation process, including template use.) These templates can be reviewed to see which defaults are used and where configuration parameters from the ``config.sh`` file are assigned. List of Template Files ^^^^^^^^^^^^^^^^^^^^^^ @@ -52,70 +53,72 @@ and are shown in :numref:`Table %s `. .. _TemplateFiles: -.. table:: Template Files for a Regional Workflow - - +-----------------------------+-------------------------------------------------------------+ - | **File Name** | **Description** | - +=============================+=============================================================+ - | data_table | Cycle-independent file that the forecast model reads in at | - | | the start of each forecast. It is an empty file. No need to | - | | change. | - +-----------------------------+-------------------------------------------------------------+ - | diag_table_[CCPP] | File specifying the output fields of the forecast model. | - | | A different diag_table may be configured for different | - | | CCPP suites. | - +-----------------------------+-------------------------------------------------------------+ - | field_table_[CCPP] | Cycle-independent file that the forecast model reads in at | - | | the start of each forecast. It specifies the tracers that | - | | the forecast model will advect. A different field_table | - | | may be needed for different CCPP suites. | - +-----------------------------+-------------------------------------------------------------+ - | FV3.input.yml | YAML configuration file containing the forecast model’s | - | | namelist settings for various physics suites. The values | - | | specified in this file update the corresponding values in | - | | the ``input.nml`` file. This file may be modified for the | - | | specific namelist options of your experiment. | - +-----------------------------+-------------------------------------------------------------+ - | FV3LAM_wflow.xml | Rocoto XML file to run the workflow. It is filled in using | - | | the ``fill_template.py`` python script that is called in | - | | the ``generate_FV3LAM_wflow.sh``. 
| - +-----------------------------+-------------------------------------------------------------+ - | input.nml.FV3 | Namelist file of the weather model. | - +-----------------------------+-------------------------------------------------------------+ - | model_configure | Settings and configurations for the NUOPC/ESMF main | - | | component. | - +-----------------------------+-------------------------------------------------------------+ - | nems.configure | NEMS (NOAA Environmental Modeling System) configuration | - | | file, no need to change because it is an atmosphere-only | - | | model in the SRW Application. | - +-----------------------------+-------------------------------------------------------------+ - | regional_grid.nml | Namelist settings for the code that generates an ESG grid. | - +-----------------------------+-------------------------------------------------------------+ - | README.xml_templating.md | Instruction of Rocoto XML templating with Jinja. | - +-----------------------------+-------------------------------------------------------------+ - -Additional information related to the ``diag_table_[CCPP]``, ``field_table_[CCPP]``, ``input.nml.FV3``, ``model_conigure``, and ``nems.configure`` can be found in the `UFS Weather Model User's Guide `__, -while information on the ``regional_grid.nml`` can be found in the `UFS_UTILS User’s Guide +.. table:: Template Files for the Regional Workflow + + +-----------------------------+--------------------------------------------------------------+ + | **File Name** | **Description** | + +=============================+==============================================================+ + | data_table | :term:`Cycle-independent` file that the forecast model | + | | reads in at the start of each forecast. It is an empty file. | + | | No need to change. | + +-----------------------------+--------------------------------------------------------------+ + | diag_table_[CCPP] | File specifying the output fields of the forecast model. | + | | A different ``diag_table`` may be configured for different | + | | :term:`CCPP` suites. | + +-----------------------------+--------------------------------------------------------------+ + | field_table_[CCPP] | :term:`Cycle-independent` file that the forecast model | + | | reads in at the start of each forecast. It specifies the | + | | :term:`tracers` that the forecast model will :term:`advect`. | + | | A different ``field_table`` may be needed for different | + | | CCPP suites. | + +-----------------------------+--------------------------------------------------------------+ + | FV3.input.yml | YAML configuration file containing the forecast model's | + | | namelist settings for various physics suites. The values | + | | specified in this file update the corresponding values in | + | | the ``input.nml`` file. This file may be modified for the | + | | specific namelist options of your experiment. | + +-----------------------------+--------------------------------------------------------------+ + | FV3LAM_wflow.xml | Rocoto XML file to run the workflow. It is filled in using | + | | the ``fill_template.py`` python script that is called in | + | | ``generate_FV3LAM_wflow.sh``. | + +-----------------------------+--------------------------------------------------------------+ + | input.nml.FV3 | Namelist file for the Weather Model. 
| + +-----------------------------+--------------------------------------------------------------+ + | model_configure | Settings and configurations for the | + | | :term:`NUOPC`/:term:`ESMF` main component. | + +-----------------------------+--------------------------------------------------------------+ + | nems.configure | :term:`NEMS` (NOAA Environmental Modeling System) | + | | configuration file. No need to change because it is an | + | | atmosphere-only model in the SRW Application. | + +-----------------------------+--------------------------------------------------------------+ + | regional_grid.nml | Namelist settings for the code that generates an :term:`ESG` | + | | grid. | + +-----------------------------+--------------------------------------------------------------+ + | README.xml_templating.md | Instructions for Rocoto XML templating with Jinja. | + +-----------------------------+--------------------------------------------------------------+ + +Additional information related to ``diag_table_[CCPP]``, ``field_table_[CCPP]``, ``input.nml.FV3``, ``model_conigure``, and ``nems.configure`` can be found in the `UFS Weather Model User's Guide `__, +while information on ``regional_grid.nml`` can be found in the `UFS_UTILS User's Guide `_. Migratory Route of the Input Files in the Workflow ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -:numref:`Figure %s ` shows how the case-specific input files in the -``ufs-srweather-app/regional_workflow/ush/templates/`` directory flow to the experiment directory. The value of ``CCPP_PHYS_SUITE`` is specified in the configuration file ``config.sh``. The template input files corresponding to ``CCPP_PHYS_SUITE``, such as ``field_table`` and ``nems_configure``, are copied to the experiment directory ``EXPTDIR``, and the namelist file of the weather model ``input.nml`` is created from the ``input.nml.FV3`` and ``FV3.input.yml`` files by running the script ``generate_FV3LAM_wflow.sh``. While running the task ``RUN_FCST`` in the regional workflow as shown in :numref:`Figure %s `, the ``field_table``, ``nems.configure``, and ``input.nml`` files, located in ``EXPTDIR``, are linked to the cycle directory ``CYCLE_DIR/``. Additionally, ``diag_table`` and ``model_configure`` are copied from the ``templates`` directory. Finally, these files are updated with the variables specified in ``var_defn.sh``. +:numref:`Figure %s ` shows how the input files in the template directory (``ufs-srweather-app/regional_workflow/ush/templates/``) flow to the experiment directory. First, the CCPP physics suite is specified in the configuration file. The template input files corresponding to the selected physics suite, such as ``field_table_[CCPP]`` and ``nems.configure_[CCPP]``, are copied to the experiment directory (``$EXPTDIR``). Additionally, the namelist file of the Weather Model (``input.nml``) is created from the ``input.nml.FV3`` and ``FV3.input.yml`` files by running the workflow generation script. While running the ``RUN_FCST`` task in the regional workflow as shown in :numref:`Figure %s `, the ``field_table``, ``nems.configure``, and ``input.nml`` files, located in ``$EXPTDIR``, are linked to the cycle directory ``$CYCLE_DIR``. Additionally, ``diag_table`` and ``model_configure`` are copied from the ``templates`` directory. Finally, these files are updated with the variables specified in ``var_defn.sh``. .. _MigratoryRoute: .. 
figure:: _static/FV3LAM_wflow_input_path.png + :alt: Flowchart showing how information on the physics suite travels from the config shell file to the setup shell file to the workflow generation script to the run forecast ex-script. As this information is fed from one script to the next, file paths and variables required for workflow execution are set. - *Migratory route of input files* + *Migratory route of input files* .. _OutputFiles: Output Files -============ +============== -The location of the output files written to disk is defined by the experiment directory, -``EXPTDIR/YYYYMMDDHH``, as set in ``config.sh``. +The location of the output files written to disk is within a subdirectory of the experiment directory, +``EXPTDIR/YYYYMMDDHH``, named based on the settings in ``config.sh``. Initial and boundary condition files ------------------------------------ @@ -130,137 +133,167 @@ experiment run directory ``EXPTDIR/YYYYMMDDHH/INPUT`` and consist of the followi * ``C403_grid.tile7.halo3.nc`` * ``gfs_bndy.tile7.000.nc`` * ``gfs_bndy.tile7.006.nc`` +* ``gfs_bndy.tile7.012.nc`` * ``gfs_ctrl.nc`` -* ``gfs_data.nc -> gfs_data.tile7.halo0.nc`` -* ``grid_spec.nc -> ../../grid/C403_mosaic.halo3.nc`` -* ``grid.tile7.halo4.nc -> ../../grid/C403_grid.tile7.halo4.nc`` -* ``oro_data.nc -> ../../orog/C403_oro_data.tile7.halo0.nc`` +* ``gfs_data.nc`` +* ``gfs_data.tile7.halo0.nc`` +* ``grid_spec.nc`` +* ``grid.tile7.halo4.nc`` +* ``oro_data.nc`` +* ``oro_data.tile7.halo4.nc`` * ``sfc_data.nc -> sfc_data.tile7.halo0.nc`` +* ``sfc_data.tile7.halo0.nc`` +* ``tmp_ICS`` +* ``tmp_LBCS`` -These output files are used as inputs for the UFS weather model, and are described in the `Users Guide +These output files are used as inputs for the UFS Weather Model, and are described in the `UFS Weather Model User's Guide `__. .. COMMENT: Change link above (structure of "latest" is significantly different) UFS Weather Model ------------------ -As mentioned previously, the workflow can be run in ‘community’ or ‘nco’ mode, which determines -the location and names of the output files. In addition to this option, output can also be in -netCDF or NEMSIO format. The output file format is set in the ``model_configure`` files using the -``output_file`` variable. At this time, due to limitations in the post-processing component, only netCDF format output is recommended for the SRW Application. +------------------ +As stated in :numref:`Section %s `, the workflow can be run in 'community' or 'nco' mode, which determines the location and names of the output files. Weather Model output files can also be in :term:`netCDF` or :term:`NEMSIO` format. The output file format is set in the ``model_configure`` file (see :numref:`Table %s `) using the ``output_file`` variable. At this time, due to limitations in the post-processing component, only netCDF output is recommended for the SRW Application. .. note:: - In summary, the fully supported options for this release include running in ‘community’ mode with netCDF format output files. + The fully supported options for this release include running in 'community' mode with netCDF-formatted output files. In this case, the netCDF output files are written to the ``EXPTDIR/YYYYMMDDHH`` directory. 
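To show where the format choice described above is made, here is a minimal sketch of the relevant ``model_configure`` entry. The ``output_file`` keyword comes from the paragraph above; the exact ``key: value`` layout is an assumption and should be checked against the ``model_configure`` template shipped with the App:

.. code-block:: console

   # In model_configure (sketch): request netCDF-format write-component output
   output_file: 'netcdf'
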
The bases of the file names are specified in the input file ``model_configure`` and are set to the following in the SRW Application: * ``dynfHHH.nc`` * ``phyfHHH.nc`` -Additional details may be found in the UFS Weather Model `Users Guide +where HHH corresponds to the 3-digit forecast hour (e.g., ``dynf006.nc`` for the 6th hour of the forecast). Additional details may be found in the `UFS Weather Model User's Guide `__. Unified Post Processor (UPP) ---------------------------- -Documentation for the UPP output files can be found `here `__. +Documentation for the UPP output files can be found in the `UPP User's Guide `__. For the SRW Application, the weather model netCDF output files are written to the ``EXPTDIR/YYYYMMDDHH/postprd`` directory and have the naming convention (file->linked to): -* ``BGRD3D_{YY}{JJJ}{hh}{mm}f{fhr}00 -> {domain}.t{cyc}z.bgrd3df{fhr}.tmXX.grib2`` -* ``BGDAWP_{YY}{JJJ}{hh}{mm}f{fhr}00 -> {domain}.t{cyc}z.bgdawpf{fhr}.tmXX.grib2`` +* ``NATLEV_{YY}{JJJ}{hh}{mm}f{fhr}00 -> {domain}.t{cyc}z.natlevf{fhr}.tmXX.grib2`` +* ``PRSLEV_{YY}{JJJ}{hh}{mm}f{fhr}00 -> {domain}.t{cyc}z.prslevf{fhr}.tmXX.grib2`` -The default setting for the output file names uses ``rrfs`` for ``{domain}``. This may be overridden by the user in the ``config.sh`` settings. +The default setting for the output file names uses ``rrfs`` for ``{domain}``. This may be overridden by the user in the ``config.sh`` settings. -If you wish to modify the fields or levels that are output from the UPP, you will need to make modifications to file ``fv3lam.xml``, which resides in the UPP repository distributed with the UFS SRW Application. Specifically, if the code was cloned in the directory ``ufs-srweather-app``, the file will be located in ``ufs-srweather-app/src/UPP/parm``. +Modifying the UPP Output +^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +If users wish to modify the fields or levels that are output from the UPP, they will need to make modifications to ``fv3lam.xml``, which resides in the UPP repository distributed with the UFS SRW Application. If the code was cloned into the directory ``ufs-srweather-app``, the file will be located in ``ufs-srweather-app/src/UPP/parm``. .. note:: This process requires advanced knowledge of which fields can be output for the UFS Weather Model. -Use the directions in the `UPP User's Guide `__ for details on how to make modifications to the ``fv3lam.xml`` file and for remaking the flat text file that the UPP reads, which is called ``postxconfig-NT-fv3lam.txt`` (default). +UPP Product Output Tables for the UFS SRW LAM Grid: + * :doc:`3D Native Hybrid Level Fields ` + * :doc:`3D Pressure Level Fields ` + +Use the instructions in the `UPP User's Guide `__ to make modifications to the ``fv3lam.xml`` file and to remake the flat text file that the UPP reads, which is called ``postxconfig-NT-fv3lam.txt`` (default). Once you have created the new flat text file reflecting your changes, you will need to modify your ``config.sh`` to point the workflow to the new text file. In your ``config.sh``, set the following: .. code-block:: console - USE_CUSTOM_POST_CONFIG_FILE=”TRUE” - CUSTOM_POST_CONFIG_PATH=”” + USE_CUSTOM_POST_CONFIG_FILE="TRUE" + CUSTOM_POST_CONFIG_PATH="" -which tells the workflow to use the custom file located in the user-defined path. The path should include the filename. If this is set to true and the file path is not found, then an error will occur when trying to generate the SRW Application workflow. +which tells the workflow to use the custom file located in the user-defined path. 
The path should include the filename. If this is set to true, and the file path is not found, then an error will occur when trying to generate the SRW Application workflow. -You may then start your case workflow as usual and the UPP will use the new flat ``*.txt`` file. +Users may then start their experiment workflow as usual and the UPP will use the new flat ``*.txt`` file. -.. _DownloadingStagingInput: +.. _SatelliteProducts: -Downloading and Staging Input Data -================================== -A set of input files, including static (fix) data and raw initial and lateral boundary conditions (:term:`IC/LBC`'s), are needed to run the SRW Application. +Outputting Satellite Products from UPP +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -.. _StaticFixFiles: +Synthetic satellite products for several instruments and channels (e.g. GOES 16/17) may be output through the UPP using the Community Radiative Transfer Model (:term:`CRTM`). External CRTM coefficient files, available through the UPP stand-alone release, will need to be manually downloaded before running the workflow. These instructions assume that the UPP configuration file has already been set up to output satellite products. -Static Files ------------- -The environment variables ``FIXgsm``, ``TOPO_DIR``, and ``SFC_CLIMO_INPUT_DIR`` indicate the path to -the directories where the static files are located. If you are on a pre-configured or configurable platform (i.e., a Level 1 or 2 platform), there is no need to stage the fixed files manually because they have been prestaged, and the paths are set in ``regional_workflow/ush/setup.sh``. On Level 3 & 4 systems, the static files can be downloaded individually or as a full tar file from the `FTP data repository `__ or from `Amazon Web Services (AWS) cloud storage `__ using the ``wget`` command. Then ``tar -xf `` will extract the compressed file: +Download and unpack the external files: .. code-block:: console - wget https://ufs-data.s3.amazonaws.com/public_release/ufs-srweather-app-v1.0.0/fix/fix_files.tar.gz - tar -xf fix_files.tar.gz + mkdir crtm && cd crtm + wget https://github.com/NOAA-EMC/EMC_post/releases/download/upp_v10.1.0/fix.tar.gz + tar -xzf fix.tar.gz -The paths to the staged files must then be set in ``config.sh``. Add the following code or alter the variable paths if they are already listed in the ``config.sh`` file: +Modify the ``config.sh`` file to include the following lines: -* ``FIXgsm=/path-to/fix/fix_am`` -* ``TOPO_DIR=/path-to/fix/fix_am/fix_orog`` -* ``SFC_CLIMO_INPUT_DIR=/path-to/fix_am/fix/sfc_climo/`` +.. code-block:: console -.. _InitialConditions: + USE_CRTM="TRUE" + CRTM_DIR="/path/to/top/crtm/dir" -Initial Condition Formats and Source ------------------------------------- -The SRW Application currently supports raw initial and lateral boundary conditions from numerous models (i.e., FV3GFS, NAM, RAP, HRRR). The data can be provided in three formats: :term:`NEMSIO`, netCDF, or :term:`GRIB2`. The SRW Application currently only supports the use of NEMSIO and netCDF input files from the GFS. +By setting ``USE_CRTM`` to "TRUE", the workflow will use the path defined in ``CRTM_DIR`` to link the necessary coefficient files to the working directory at runtime. Otherwise, it is assumed that no satellite fields are being requested in the UPP configuration. ``CRTM_DIR`` should point to the top CRTM directory where the fix files are located. + +.. note:: + Dependencies for outputting synthetic satellite products may exist based on model configuration (e.g. model physics). 
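As a concrete illustration (the path below is hypothetical; replace it with wherever the archive was actually unpacked), the two settings might look like this in ``config.sh``:

.. code-block:: console

   # Illustrative values only: point CRTM_DIR at the directory created above,
   # i.e., the top-level directory containing the unpacked coefficient (fix) files.
   USE_CRTM="TRUE"
   CRTM_DIR="/home/$USER/crtm"

Before setting ``CRTM_DIR``, users may wish to confirm that the coefficient files are present in that directory.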
-The data required to run the "out-of-the-box" SRW App case described in :numref:`Chapter %s ` is already preinstalled on `Level 1 `__ systems. Users on other systems can find the required IC/LBC data in the `FTP data repository `__ or on `AWS cloud storage `_.
-To add this data to your system, run the following commands from the ``ufs-srweather-app`` directory:
+
+.. _DownloadingStagingInput:
+
+Downloading and Staging Input Data
+==================================
+A set of input files, including static (fix) data and raw initial and lateral boundary conditions (:term:`IC/LBC`'s), is required to run the SRW Application. The data required for the "out-of-the-box" SRW App case described in Chapters :numref:`%s ` and :numref:`%s ` is already preinstalled on `Level 1 & 2 `__ systems, along with data required to run the :ref:`WE2E ` test cases. These users do not need to stage the fixed files manually because they have been prestaged, and the paths are set in ``regional_workflow/ush/setup.sh``.
+
+Users on Level 3 & 4 systems can find the data required for the "out-of-the-box" SRW App case in the `UFS SRW App Data Bucket `__ by clicking on `Browse Bucket `__. In general, files required for a particular experiment must be downloaded individually. However, the input data required to run the default SRW App case from ``config.community.sh`` is available as a full tar file. To download the tar file:

.. code-block:: console

-   wget https://ftp.emc.ncep.noaa.gov/EIB/UFS/SRW/v1p0/simple_test_case/gst_model_data.tar.gz
-   tar -xf gst_model_data.tar.gz
+   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/current_srw_release_data
+   tar -xf files.tar.gz

-This will extract the files and place them within a new ``model_data`` directory inside the ``ufs-srweather-app``.
-Then, the paths to ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` must be set in the ``config.sh`` file.

+.. _StaticFixFiles:

-.. code-block:: console

Static Files
--------------

-   cd
-   vi config.sh

+Many static files are available in the `"fix" directory `__ of the SRW Data Bucket. If users prefer not to download the tar file with the current release data in :numref:`Section %s ` above, they can download static files individually from the `"fix" directory `__ of the SRW Data Bucket using the ``wget`` command for each required file.

-Then, in ``config.sh``, set the following environment variables:
+The environment variables ``FIXgsm``, ``TOPO_DIR``, and ``SFC_CLIMO_INPUT_DIR`` indicate the path to
+the directories where the static files are located. After downloading the experiment data, users must set the paths to the files in ``config.sh``. Add the following code or alter the variable paths if they are already listed in the ``config.sh`` file:
+
+* ``FIXgsm="/path-to/fix/fix_am"``
+* ``TOPO_DIR="/path-to/fix/fix_am/fix_orog"``
+* ``SFC_CLIMO_INPUT_DIR="/path-to/fix_am/fix/sfc_climo/"``
+
+.. _InitialConditions:
+
+Initial Condition/Lateral Boundary Condition File Formats and Source
+-----------------------------------------------------------------------
+The SRW Application currently supports raw initial and lateral boundary conditions from numerous models (i.e., FV3GFS, NAM, RAP, HRRR). The data can be provided in three formats: :term:`NEMSIO`, :term:`netCDF`, or :term:`GRIB2`.
+
+To download the model input data for the 12-hour "out-of-the-box" experiment configuration in the ``config.community.sh`` file, run:

..
code-block:: console - USE_USER_STAGED_EXTRN_FILES=TRUE - EXTRN_MDL_SOURCE_BASEDIR_ICS= - EXTRN_MDL_SOURCE_BASEDIR_LBCS= + wget https://noaa-ufs-srw-pds.s3.amazonaws.com/input_model_data/FV3GFS/grib2/2019061518/gfs.t18z.pgrb2.0p25.f000 + wget https://noaa-ufs-srw-pds.s3.amazonaws.com/input_model_data/FV3GFS/grib2/2019061518/gfs.t18z.pgrb2.0p25.f006 + +.. + COMMENT: Add the following line once PR #766 goes through & Data Bucket is updated: + wget https://noaa-ufs-srw-pds.s3.amazonaws.com/input_model_data/FV3GFS/grib2/2019061518/gfs.t18z.pgrb2.0p25.f012 -These environment variables describe what :term:`IC/LBC` files to use (pre-staged files or files to be automatically pulled from the NOAA HPSS) and the location of the IC/LBC files. ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` is the directory where the initial conditions are located, and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` is the directory where the lateral boundary conditions are located. +To download data for different models or in different formats, users can explore the data bucket and replace the links above with ones that fetch their desired data. Initial and Lateral Boundary Condition Organization --------------------------------------------------- -The suggested directory structure and naming convention for the raw input files is described -below. While there is flexibility to modify these settings, this will provide the most reusability for multiple dates when using the SRW Application workflow. -For ease of reusing the ``config.sh`` for multiple dates and cycles, it is recommended to set up your raw :term:`IC/LBC` files such that it includes the model name (e.g., FV3GFS, NAM, RAP, HRRR) and ``YYYYMMDDHH``, for example: ``/path-to/model_data/FV3GFS/2019061518``. Since both initial and lateral boundary condition files are necessary, you can also include an ICS and LBCS directory. The sample IC/LBC's available at the FTP data repository are structured as follows: +The paths to ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` must be set in the ``config.sh`` file as follows: -* ``/path-to/model_data/MODEL/YYYYMMDDHH/ICS`` -* ``/path-to/model_data/MODEL/YYYYMMDDHH/LBCS`` +.. code-block:: console + + USE_USER_STAGED_EXTRN_FILES="TRUE" + EXTRN_MDL_SOURCE_BASEDIR_ICS="" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="" -When files are pulled from the NOAA HPSS, the naming convention looks something like: +These last two variables describe where the :term:`IC` and :term:`LBC` file directories are located, respectively. For ease of reusing ``config.sh`` across experiments, it is recommended that users set up the raw :term:`IC/LBC` file paths to include the model name (e.g., FV3GFS, NAM, RAP, HRRR) and date (in ``YYYYMMDDHH`` format). In addition, users can include separate ICS and LBCS directories. For example: ``/path-to/model_data/FV3GFS/2019061518/ICS`` and ``/path-to/model_data/FV3GFS/2019061518/LBCS``. While there is flexibility to modify these settings, this structure will provide the most reusability for multiple dates when using the SRW Application workflow. 
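As an illustration, the two FV3GFS GRIB2 files downloaded above could be staged under this recommended structure as follows (the base path is an example only, and the placement of the 0-hour ``f000`` file under ``ICS`` and later forecast hours under ``LBCS`` follows the convention described below):

.. code-block:: console

   # Example staging of the downloaded files; adjust the base directory to your system
   mkdir -p /path-to/model_data/FV3GFS/2019061518/ICS /path-to/model_data/FV3GFS/2019061518/LBCS
   mv gfs.t18z.pgrb2.0p25.f000 /path-to/model_data/FV3GFS/2019061518/ICS/
   mv gfs.t18z.pgrb2.0p25.f006 /path-to/model_data/FV3GFS/2019061518/LBCS/

The ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` variables set above would then point to these ``ICS`` and ``LBCS`` subdirectories, respectively.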
+
+When files are pulled from NOAA :term:`HPSS` (rather than downloaded from the data bucket), the naming convention looks something like:

* FV3GFS (GRIB2): ``gfs.t{cycle}z.pgrb2.0p25.f{fhr}``
* FV3GFS (NEMSIO):
@@ -269,6 +302,8 @@ When files are pulled from the NOAA HPSS, the naming convention looks something
* RAP (GRIB2): ``rap.t{cycle}z.wrfprsf{fhr}.grib2``
* HRRR (GRIB2): ``hrrr.t{cycle}z.wrfprsf{fhr}.grib2``

+where {cycle} corresponds to the 2-digit hour of the day when the forecast cycle starts, and {fhr} corresponds to the nth hour of the forecast. For example, a forecast cycle starting at 18:00 UTC would have a {cycle} value of 18, and its initial (0th-hour) file would end in ``f000``. The LBCS file valid at 21:00 UTC (the 3rd forecast hour) would be named ``gfs.t18z.pgrb2.0p25.f003``.
+
In order to preserve the original file name, the ``f00`` files are placed in the ``ICS`` directory and all other forecast files are placed in the ``LBCS`` directory. Then, a symbolic link of the original files in the ``ICS/LBCS`` directory to the ``YYYYMMDDHH`` directory is suggested with
@@ -278,7 +313,7 @@ the cycle removed. For example:

   ln -sf /path-to/model_data/RAP/2020041212/ICS/rap.t12z.wrfprsf00.grib2 /path-to/model_data/RAP/2020041212/rap.wrfprsf00.grib2

-Doing this allows for the following to be set in the ``config.sh`` regardless of what cycle you are running:
+Linking the files like this removes the cycle-dependent part of the file names so that each cycle will use the same file name, regardless of initialization time. This allows for the following to be set in the ``config.sh`` regardless of what cycle you are running:

.. code-block:: console

@@ -288,7 +323,7 @@ Doing this allows for the following to be set in the ``config.sh`` regardless of
   EXTRN_MDL_SOURCE_BASEDIR_LBCS="/path-to/model_data/RAP"
   EXTRN_MDL_FILES_LBCS=( "rap.wrfprsf03.grib2" "rap.wrfprsf06.grib2" )

-If you choose to forgo the extra ``ICS`` and ``LBCS`` directory, you may either
+If users choose to forgo the extra ``ICS`` and ``LBCS`` directory, they may either
rename the original files to remove the cycle or modify the ``config.sh`` to set:

.. code-block:: console

@@ -299,18 +334,16 @@ rename the original files to remove the cycle or modify the ``config.sh`` to set

Default Initial and Lateral Boundary Conditions
-----------------------------------------------
The default initial and lateral boundary condition files are set to be a severe weather case
-from 20190615 at 00 UTC. FV3GFS GRIB2 files are the default model and file format. A tar file
-(``gst_model_data.tar.gz``) containing the model data for this case is available on EMC's FTP
-data repository at https://ftp.emc.ncep.noaa.gov/EIB/UFS/SRW/v1p0/simple_test_case/. It is
-also available on Amazon Web Services (AWS) at https://ufs-data.s3.amazonaws.com/public_release/ufs-srweather-app-v1.0.0/ic/gst_model_data.tar.gz.
+from 20190615 at 18 UTC. FV3GFS GRIB2 files are the default model and file format. A tar file
+(``gst_model_data.tar.gz``) containing the model data for this case is available in the `UFS SRW App Data Bucket `__.
+
+..
+   COMMENT: Update tar file name later.

Running the App for Different Dates
-----------------------------------
-If users want to run the SRW Application for dates other than 06-15-2019, you will need to
-make a change in the case to specify the desired data. This is done by modifying the
-``config.sh`` ``DATE_FIRST_CYCL``, ``DATE_LAST_CYCL``, and ``CYCL_HRS`` settings. The
-forecast length can be modified by changing the ``FCST_LEN_HRS``.
In addition, the lateral
-boundary interval can be specified using the ``LBC_SPEC_INTVL_HRS`` variable.
+If users want to run the SRW Application for dates other than June 15-16, 2019, they will need to
+modify the ``config.sh`` settings, including the ``DATE_FIRST_CYCL``, ``DATE_LAST_CYCL``, and ``CYCL_HRS`` variables. The forecast length can be modified by changing the ``FCST_LEN_HRS``. In addition, the lateral boundary interval can be specified using the ``LBC_SPEC_INTVL_HRS`` variable.

Users will need to ensure that the initial and lateral boundary condition files are available in the specified path for their new date, cycle, and forecast length.
@@ -332,12 +365,6 @@ NOMADS: https://nomads.ncep.noaa.gov/pub/data/nccf/com/{model}/prod, where model

* HRRR - available for the last 2 days https://nomads.ncep.noaa.gov/pub/data/nccf/com/hrrr/prod/

-NCDC archive:
-
-* GFS: https://www.ncdc.noaa.gov/data-access/model-data/model-datasets/global-forcast-system-gfs
-* NAM: https://www.ncdc.noaa.gov/data-access/model-data/model-datasets/north-american-mesoscale-forecast-system-nam
-* RAP: https://www.ncdc.noaa.gov/data-access/model-data/model-datasets/rapid-refresh-rap
-
AWS S3:

* GFS: https://registry.opendata.aws/noaa-gfs-bdp-pds/
@@ -347,6 +374,10 @@ Google Cloud:

* HRRR: https://console.cloud.google.com/marketplace/product/noaa-public/hrrr

+FTP Data Repository: (data for SRW Release 1.0.0 & 1.0.1)
+
+* https://ftp.emc.ncep.noaa.gov/EIB/UFS/SRW/v1p0/fix/
+* https://ftp.emc.ncep.noaa.gov/EIB/UFS/SRW/v1p0/simple_test_case/
+
Others:

* Univ. of Utah HRRR archive: http://home.chpc.utah.edu/~u0553130/Brian_Blaylock/cgi-bin/hrrr_download.cgi
@@ -379,7 +410,7 @@ are initializing from NEMSIO format FV3GFS files.

Best Practices for Conserving Disk Space and Keeping Files Safe
---------------------------------------------------------------
Initial and lateral boundary condition files are large and can occupy a significant amount of
-disk space. If various users will employ a common file system to conduct runs, it is recommended
+disk space. If several users will employ a common file system to run forecasts, it is recommended
that the users share the same ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``
directories. That way, if raw model input files are already on disk for a given date they do not
need to be replicated.
diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst
index b151fcc481..603236cce3 100644
--- a/docs/UsersGuide/source/Introduction.rst
+++ b/docs/UsersGuide/source/Introduction.rst
@@ -38,6 +38,113 @@ File paths or code that include angle brackets (e.g., ``build__` below or refer to :numref:`Chapter %s ` for a more in-depth treatment.

+.. _SRWPrerequisites:
+
+Prerequisites for Using the SRW Application
+===============================================
+
+Background Knowledge Prerequisites
+--------------------------------------
+
+The instructions in this documentation assume that users have certain background knowledge:
+
+* Familiarity with LINUX/UNIX systems
+* Command line basics
+* System configuration knowledge (e.g., compilers, environment variables, paths, etc.)
+* Numerical Weather Prediction
+* Meteorology (particularly meteorology at the scales being predicted)
+
+..
+   COMMENT: Suggested sub-bullets for Meteorology/NWP? Cumulus and microphysics parameterizations? Convection? Microphysics?
+ +Additional background knowledge in the following areas could be helpful: +* High-Performance Computing (HPC) Systems for those running the SRW App on an HPC system +* Programming (particularly Python) for those interested in contributing to the SRW App code +* Creating an SSH Tunnel to access HPC systems from the command line +* Containerization +* Workflow Managers/Rocoto + + +Software/Operating System Requirements +----------------------------------------- +The UFS SRW Application has been designed so that any sufficiently up-to-date machine with a UNIX-based operating system should be capable of running the application. NOAA `Level 1 & 2 systems `__ already have these prerequisites installed. However, users working on other systems must ensure that the following requirements are installed on their system: + +**Minimum Platform Requirements:** + +* POSIX-compliant UNIX-style operating system + +* >40 GB disk space + + * 18 GB input data from GFS, RAP, and HRRR for "out-of-the-box" SRW App case described in :numref:`Chapter %s ` + * 6 GB for :term:`HPC-Stack` full installation + * 1 GB for ufs-srweather-app installation + * 11 GB for 48hr forecast on CONUS 25km domain + +* 4GB memory (CONUS 25km domain) + +* Fortran compiler with full Fortran 2008 standard support +* Fortran compiler with support for Fortran 2003 + + * gfortran v9+ or ifort v18+ are the only ones tested, but others may work. + +.. + COMMENT: Which one is it?! + +* C and C++ compilers compatible with the Fortran compiler + + * gcc v9+, ifort v18+, and clang v9+ (macOS, native Apple clang or LLVM clang) have been tested + +* Python v3.6+, including prerequisite packages ``jinja2``, ``pyyaml`` and ``f90nml`` + + * Python packages ``scipy``, ``matplotlib``, ``pygrib``, ``cartopy``, and ``pillow`` are required for users who would like to use the provided graphics scripts + +* Perl 5 + +* git v1.8+ + +.. + COMMENT: Should curl/wget/TIFF library also be required? These are listed as prerequisites for building HPC-Stack on generic MacOS/Linux + + +The following software is also required to run the SRW Application, but the :term:`HPC-Stack` (which contains the software libraries necessary for building and running the SRW App) can be configured to build these requirements: + +* CMake v3.15+ + +* MPI (MPICH, OpenMPI, or other implementation) + + * Only **MPICH** can be built with HPC-Stack. Other options must be installed separately by the user. + +* Software libraries + + * netCDF (C and Fortran libraries) + * HDF5 + * ESMF 8.2.0 + * Jasper + * libJPG + * libPNG + * zlib + +For MacOS systems, some additional software is needed. It is recommended that users install this software using the `Homebrew `__ package manager for MacOS: + +* brew install gcc@11 +* brew install cmake +* brew install make +* brew install wget +* brew install coreutils +* brew install pkg-config + +.. + COMMENT: Is this still accurate? It seems like we should delete the last one. And gcc@11 is basically the same as requiring fortran/C/C++ compilers, no? CMake is listed above. + +Optional but recommended prerequisites for all systems: + +* Conda for installing/managing Python packages +* Bash v4+ +* Rocoto Workflow Management System (1.3.1) +* Python packages ``scipy``, ``matplotlib``, ``pygrib``, ``cartopy``, and ``pillow`` for graphics +* Lmod + + .. 
_ComponentsOverview: SRW App Components Overview diff --git a/docs/UsersGuide/source/LAMGrids.rst b/docs/UsersGuide/source/LAMGrids.rst index 84d7cd84f2..1d4070f751 100644 --- a/docs/UsersGuide/source/LAMGrids.rst +++ b/docs/UsersGuide/source/LAMGrids.rst @@ -1,59 +1,92 @@ .. _LAMGrids: -======================================================================== -Limited Area Model (LAM) Grids: Predefined and User-Generated Options -======================================================================== -In order to set up the workflow and experiment generation of the UFS SRW App, the user -must choose between three predefined FV3-LAM grids or generate a user-defined grid. -At this time, full support will only be provided to those using one of the three predefined +================================================================================= +Limited Area Model (:term:`LAM`) Grids: Predefined and User-Generated Options +================================================================================= +In order to set up the workflow and generate an experiment with the UFS SRW App, the user +must choose between four predefined :term:`FV3`-:term:`LAM` grids or generate a user-defined grid. +At this time, full support will only be provided to those using one of the four predefined grids supported in this release. However, preliminary information is provided at the end of this chapter that describes how users can leverage the SRW App workflow scripts to generate -their own user-defined grid. This feature is not fully supported at this time and is -"use at your own risk". +their own user-defined grid. Currently, this feature is not fully supported and is +"use at your own risk." Predefined Grids -================ -The UFS SRW App release includes three predefined LAM grids that users can choose from -prior to generating a workflow/experiment configuration. To select a predefined grid, +================= +The UFS SRW App release includes four predefined limited area model (:term:`LAM`) grids to choose between +prior to configuring an experiment and generating a workflow. To select a predefined grid, the ``PREDEF_GRID_NAME`` variable within the ``config.sh`` script needs to be set to one -of the following three options: +of the following four options: * ``RRFS_CONUS_3km`` +* ``SUBCONUS_Ind_3km`` * ``RRFS_CONUS_13km`` * ``RRFS_CONUS_25km`` +These four options are provided for flexibility related to compute resources and supported physics options. The high-resolution 3-km :term:`CONUS` grid generally requires more compute power and works well with three of the four supported physics suites (see :numref:`Table %s `). Low-resolution grids (i.e., 13-km and 25-km domains) require less compute power and should generally be used with the fourth supported physics suite: ``FV3_GFS_v16``. + +.. + COMMENT: FV3_WoFS can run on a 13-km and 25-km grid according to Yunheng. Can the HRRR and RRFS_v1beta also run on the 13-km and 25-km grids? Why is FV3_GFS_v16 preferred? Because it has cumulus physics? + +.. _GridPhysicsCombos: + +.. 
table:: Preferred grid and physics combinations for supported domains & physics suites + + +-------------------+------------------+ + | Grid | Physics Suite(s) | + +===================+==================+ + | RRFS_CONUS_3km | FV3_RRFS_v1beta | + | | | + | | FV3_HRRR | + | | | + | | FV3_WoFS | + +-------------------+------------------+ + | SUBCONUS_Ind_3km | FV3_RRFS_v1beta | + | | | + | | FV3_HRRR | + | | | + | | FV3_WoFS | + +-------------------+------------------+ + | RRFS_CONUS_13km | FV3_GFS_v16 | + +-------------------+------------------+ + | RRFS_CONUS_25km | FV3_GFS_v16 | + +-------------------+------------------+ + +In theory, any of the supported physics suites could be used with any of the predefined grids, but the results will be more accurate and meaningful with appropriate grid/physics pairings. + +The predefined :term:`CONUS` grids follow the naming convention (e.g., RRFS_CONUS_*km) of the prototype 3-km continental United States (CONUS) grid being tested for the Rapid Refresh Forecast System (RRFS). The RRFS will be a convection-allowing, hourly-cycled, :term:`FV3`-:term:`LAM`-based ensemble planned for operational implementation in 2024. All four supported grids were created to fit completely within the High Resolution Rapid Refresh (`HRRR `_) domain to allow for use of HRRR data to initialize the SRW App. + +Predefined 3-km CONUS Grid +----------------------------- + +The 3km CONUS domain is ideal for running the ``FV3_RRFS_v1beta`` physics suite, since this suite definition file (:term:`SDF`) was specifically created for convection-allowing scales and is the precursor to the operational physics suite that will be used in the RRFS. The 3-km domain can also be used with the ``FV3_HRRR`` and ``FV3_WoFS`` physics suites, which likewise do not include convective parameterization. In fact, the ``FV3_WoFS`` physics suite is configured to run at 3-km *or less* and could therefore run with even higher-resolution user-defined domains if desired. However, the ``FV3_GFS_v16`` suite generally should *not* be used with the 3-km domain because the cumulus physics used in that physics suite is not configured to run at the 3-km resolution. + .. _RRFS_CONUS_3km: .. figure:: _static/RRFS_CONUS_3km.sphr.native_wrtcmp.png *The boundary of the RRFS_CONUS_3km computational grid (red) and corresponding write-component grid (blue).* -The predefined grids are named after the prototype 3-km continental United States (CONUS) grid being -tested for the Rapid Refresh Forecast System (RRFS), which will be a convection-allowing, -hourly-cycled, FV3-LAM-based ensemble planned for operational implementation in 2024. To allow -for use of High Resolution Rapid Refresh (`HRRR `_) data to -initialize the SRW App, all three supported grids were created to fit completely within the HRRR domain. -Three resolution options were provided for flexibility related to compute resources -and physics options. For example, a user may wish to use the 13-km or 25-km domain when running -with the ``FV3_GFS_v16`` suite definition file (SDF), since that SDF uses cumulus physics that are -not configured to run at the 3-km resolution. In addition, users will have fewer computational -constraints when running with the 13-km and 25-km domains. - -The boundary of the ``RRFS_CONUS_3km`` domain is shown in :numref:`Figure %s ` (in red). 
-Note that while it is possible to initialize the FV3-LAM with coarser external model data when using the -``RRFS_CONUS_3km`` domain, it is generally advised to use external model data that has a resolution -similar to that of the native FV3-LAM (predefined) grid. In addition, this grid is ideal for running the -``FV3_RRFS_v1beta`` suite definition file (SDF), since this SDF was specifically created for convection-allowing scales and is the -precursor to the operational physics suite that will be used in the RRFS. - -As can be seen in :numref:`Figure %s `, the boundary of the write-component grid (in blue) sits -just inside the computational domain (in red). This extra grid is required because the post-processing -utility (UPP) is currently unable to process data on the native FV3 gnomonic grid (in red). Therefore, -model data are interpolated to a Lambert conformal grid (the write component grid) in order for UPP to -read in and correctly process the data. - -The ``RRFS_CONUS_13km`` grid (:numref:`Fig. %s `) also covers the full CONUS. This grid is meant to -be run with the ``FV3_GFS_v16`` SDF. +The boundary of the ``RRFS_CONUS_3km`` domain is shown in :numref:`Figure %s ` (in red), and the boundary of the :ref:`write-component grid ` sits just inside the computational domain (in blue). This extra grid is required because the post-processing utility (:term:`UPP`) is currently unable to process data on the native FV3 gnomonic grid (in red). Therefore, model data are interpolated to a Lambert conformal grid (the write component grid) in order for UPP to read in and correctly process the data. + +.. note:: + While it is possible to initialize the FV3-LAM with coarser external model data when using the ``RRFS_CONUS_3km`` domain, it is generally advised to use external model data that has a resolution similar to that of the native FV3-LAM (predefined) grid. + + +Predefined SUBCONUS Grid Over Indianapolis +-------------------------------------------- + +.. _SUBCONUS_Ind_3km: + +.. figure:: _static/SUBCONUS_Ind_3km.png + :alt: Map of Indiana and portions of the surrounding states. The map shows the boundaries of the continental United States sub-grid centered over Indianapolis. The computational grid boundaries appear in red and the write-component grid appears just inside it in blue. + + *The boundary of the SUBCONUS_Ind_3km computational grid (red) and corresponding write-component grid (blue).* + +The ``SUBCONUS_Ind_3km`` grid covers only a small section of the :term:`CONUS` over Indianapolis. Like the ``RRFS_CONUS_3km`` grid, it is ideally paired with the ``FV3_RRFS_v1beta``, ``FV3_HRRR``, or ``FV3_WoFS`` physics suites, since these are all convection-allowing physics suites designed to work well on high-resolution grids. + +Predefined 13-km Grid +------------------------ .. _RRFS_CONUS_13km: @@ -61,9 +94,10 @@ be run with the ``FV3_GFS_v16`` SDF. *The boundary of the RRFS_CONUS_13km computational grid (red) and corresponding write-component grid (blue).* -The final predefined CONUS grid (:numref:`Fig. %s `) uses a 25-km resolution and -is meant mostly for quick testing to ensure functionality prior to using a higher-resolution domain. -However, for users who would like to use this domain for research, the ``FV3_GFS_v16`` SDF is recommended. +The ``RRFS_CONUS_13km`` grid (:numref:`Fig. %s `) covers the full :term:`CONUS`. This grid is meant to be run with the ``FV3_GFS_v16`` physics suite. 
The ``FV3_GFS_v16`` physics suite uses convective :term:`parameterizations`, whereas the other supported suites do not. Convective parameterizations are necessary for low-resolution grids because convection occurs on scales smaller than 25km and 13km. + +Predefined 25-km Grid +------------------------ .. _RRFS_CONUS_25km: @@ -71,45 +105,51 @@ However, for users who would like to use this domain for research, the ``FV3_GFS *The boundary of the RRFS_CONUS_25km computational grid (red) and corresponding write-component grid (blue).* +The final predefined :term:`CONUS` grid (:numref:`Fig. %s `) uses a 25-km resolution and +is meant mostly for quick testing to ensure functionality prior to using a higher-resolution domain. +However, for users who would like to use the 25-km domain for research, the ``FV3_GFS_v16`` :term:`SDF` is recommended for the reasons mentioned :ref:`above `. + +Ultimately, the choice of grid is experiment-dependent and resource-dependent. For example, a user may wish to use the ``FV3_GFS_v16`` physics suite, which uses cumulus physics that are not configured to run at the 3-km resolution. In this case, the 13-km or 25-km domain options are better suited to the experiment. Users will also have fewer computational constraints when running with the 13-km and 25-km domains, so depending on the resources available to them, certain grids may be better options than others. + +.. _UserDefinedGrid: + Creating User-Generated Grids -============================= -While the three predefined grids available in this release are ideal for users just starting +=============================== +While the four predefined grids available in this release are ideal for users just starting out with the SRW App, more advanced users may wish to create their own grid for testing over -a different region and/or with a different resolution. Creating a user-defined grid requires -knowledge of how the SRW App workflow functions, in particular, understanding the set of -scripts that handle the workflow and experiment generation. It is also important to note that -user-defined grids are not a supported feature of the current release; however information is -being provided for the benefit of the FV3-LAM community. +a different region and/or with a different resolution. Creating a user-defined grid requires +knowledge of how the SRW App workflow functions. In particular, it is important to understand the set of +scripts that handle the workflow and experiment generation (see :numref:`Figure %s ` and :numref:`Figure %s `). It is also important to note that user-defined grids are not a supported feature of the current release; however, information is being provided for the benefit of the FV3-LAM community. With those caveats in mind, this section provides instructions for adding a new grid to the FV3-LAM workflow that will be generated using the "ESGgrid" method (i.e., using the regional_esg_grid code -in the UFS_UTILS repository, where ESG stands for "Extended Schmidt Gnomonic"). We assume here -that the grid to be generated covers a domain that (1) does not contain either of the poles and -(2) does not cross the -180 deg --> +180 deg discontinuity in longitude near the international -date line. Instructions for domains that do not have these restrictions will be provided in a future release. +in the `UFS_UTILS `__ repository, where ESG stands for "Extended Schmidt Gnomonic"). 
We assume here that the grid to be generated covers a domain that (1) does not contain either of the poles and (2) does not cross the -180 deg --> +180 deg discontinuity in longitude near the international date line. Instructions for domains that do not have these restrictions will be provided in a future release. The steps to add such a grid to the workflow are as follows: -#. Decide on the name of the grid. For the purposes of this documentation, the grid will be called "NEW_GRID". +#. Choose the name of the grid. For the purposes of this documentation, the grid will be called "NEW_GRID". #. Add NEW_GRID to the array ``valid_vals_PREDEF_GRID_NAME`` in the ``ufs-srweather-app/regional_workflow/ush/valid_param_vals.sh`` file. -#. In the file ``ufs-srweather-app/regional_workflow/ush/set_predef_grid_params.sh``, add a stanza to - the case statement ``case ${PREDEF_GRID_NAME} in`` for NEW_GRID. An example of such a stanza - is given below along with comments describing the variables that need to be set. +#. In ``ufs-srweather-app/regional_workflow/ush/set_predef_grid_params.sh``, add a stanza to + the case statement ``case ${PREDEF_GRID_NAME} in`` for NEW_GRID. An example of such a stanza + is given :ref:`below ` along with comments describing the variables that need to be set. -To run a forecast experiment on NEW_GRID, start with a workflow configuration file for a successful -experiment (this file is named ``config.sh`` and is located in the directory -``ufs-srweather-app/regional_workflow/ush``) and change the line for ``PREDEF_GRID_NAME`` to the following: +To run a forecast experiment on NEW_GRID, start with a workflow configuration file for a successful experiment (e.g., ``config.sh``, located in the ``ufs-srweather-app/regional_workflow/ush`` subdirectory), and change the line for ``PREDEF_GRID_NAME`` to the following: .. code-block:: console PREDEF_GRID_NAME="NEW_GRID" -Then, generate a new experiment/workflow using ``generate_FV3LAM_wflow.sh`` in the usual way. +Then, generate a new experiment/workflow using ``generate_FV3LAM_wflow.sh`` in the :ref:`usual way `. + +Code Example +--------------- + +The following is an example of a code stanza for "NEW_GRID" to be added to ``set_predef_grid_params.sh``: -The following is an example of a stanza for "NEW_GRID" to be added to ``set_predef_grid_params.sh``: +.. _NewGridExample: .. code-block:: console @@ -134,7 +174,7 @@ The following is an example of a stanza for "NEW_GRID" to be added to ``set_pred ESGgrid_LAT_CTR=38.5 # The grid cell sizes in the x and y directions, where x and y are the - # native coordinates of any ESG grid. The units of x and y are in + # native coordinates of any ESG grid. The units of x and y are in # meters. These should be set to the nominal resolution we want the # grid to have. The cells will have exactly these sizes in xy-space # (computational space) but will have varying size in physical space. @@ -152,9 +192,9 @@ The following is an example of a stanza for "NEW_GRID" to be added to ``set_pred # The width of the halo (in units of grid cells) that the temporary # wide-halo grid created during the grid generation task (make_grid) - # will have. This wide-halo grid gets "shaved" down to obtain the + # will have. This wide-halo grid gets "shaved" down to obtain the # 4-cell-wide halo and 3-cell-wide halo grids that the forecast model - # (as well as other codes) will actually use. Recall that the halo is + # (as well as other codes) will actually use. 
Recall that the halo is # needed to provide lateral boundary conditions to the forecast model. # Usually, there is no need to modify this parameter. @@ -164,9 +204,9 @@ The following is an example of a stanza for "NEW_GRID" to be added to ``set_pred # is the (inverse) frequency with which (most of) the physics suite is # called. The smaller the grid cell size is, the smaller this value # needs to be in order to avoid numerical instabilities during the - # forecast. The values specified below are used only if DT_ATMOS is + # forecast. The values specified below are used only if DT_ATMOS is # not explicitly set in the user-specified experiment configuration - # file config.sh. Note that this parameter may be suite dependent. + # file config.sh. Note that this parameter may be suite dependent. if [ "${CCPP_PHYS_SUITE}" = "FV3_GFS_v16" ]; then DT_ATMOS=${DT_ATMOS:-"300"} @@ -189,18 +229,18 @@ The following is an example of a stanza for "NEW_GRID" to be added to ``set_pred # interpolated. The output fields are not specified on the native grid # but are instead remapped to this write-component grid because the # post-processing software (UPP; called during the run_post tasks) is - # not able to process fields on the native grid. The variable + # not able to process fields on the native grid. The variable # "QUILTING", which specifies whether or not to use the # write-component grid, is by default set to "TRUE". if [ "$QUILTING" = "TRUE" ]; then # The number of "groups" of MPI tasks that may be running at any given - # time to write out the output. Each write group will be writing to + # time to write out the output. Each write group will be writing to # one set of output files (a dynf${fhr}.nc and a phyf${fhr}.nc file, - # where $fhr is the forecast hour). Each write group contains + # where $fhr is the forecast hour). Each write group contains # WRTCMP_write_tasks_per_group tasks. Usually, it is sufficient to - # have just one write group. This may need to be increased if the + # have just one write group. This may need to be increased if the # forecast is proceeding so quickly that a single write group cannot # complete writing to its set of files before there is a need/request # to start writing the next set of files at the next output time (this @@ -249,7 +289,7 @@ The following is an example of a stanza for "NEW_GRID" to be added to ``set_pred WRTCMP_lat_lwr_left="23.89394570" # The grid cell sizes along the x and y directions of the - # write-component grid. Units depend on the coordinate system used by + # write-component grid. Units depend on the coordinate system used by # the grid (i.e. the value of WRTCMP_output_grid). For a Lambert # conformal write-component grid, the units are in meters. diff --git a/docs/UsersGuide/source/Non-ContainerQS.rst b/docs/UsersGuide/source/Non-ContainerQS.rst new file mode 100644 index 0000000000..90bb7bda17 --- /dev/null +++ b/docs/UsersGuide/source/Non-ContainerQS.rst @@ -0,0 +1,123 @@ +.. _NCQuickstart: + +============================ +Non-Container Quick Start +============================ + +Install the HPC-Stack +=========================== +SRW App users who are not working on a `Level 1 `__ platform will need to install the :term:`HPC-Stack` prior to building the SRW App on a new machine. Installation instructions appear in both the `HPC-Stack documentation `__ and in :numref:`Chapter %s ` of this User's Guide. The steps will vary slightly depending on the user's platform. 
However, in all cases, the process involves cloning the `HPC-Stack repository `__, creating and entering a build directory, and invoking ``cmake`` and ``make`` to build the code. This process will create a number of modulefiles and scripts that will be used for setting up the build environment for the UFS SRW App. + +Once the HPC-Stack has been successfully installed, users can move on to building the UFS SRW Application. + +.. + COMMENT: Are these notes relevant now that NCEPLIBS/NCEPLIBS-external have been changed to HPC-Stack? + .. note:: + The ``ESMFMKFILE`` variable allows HPC-Stack to find the location where ESMF has been built; if users receive an ``ESMF not found, abort`` error, they may need to specify a slightly different location: + + .. code-block:: console + + export ESMFMKFILE=${INSTALL_PREFIX}/lib64/esmf.mk + + Then they can delete and re-create the build directory and continue the build process as described above. + + .. note:: + + If users skipped the building of any of the software provided by HPC-Stack, they may need to add the appropriate locations to their ``CMAKE_PREFIX_PATH`` variable. Multiple directories may be added, separated by semicolons (;) as in the following example: + + .. code-block:: console + + cmake -DCMAKE_INSTALL_PREFIX=${INSTALL_PREFIX} -DCMAKE_PREFIX_PATH=”${INSTALL_PREFIX};/location/of/other/software” -DOPENMP=ON .. 2>&1 | tee log.cmake + + +Building the UFS SRW Application +======================================= + +For a detailed explanation of how to build and run the SRW App on any supported system, see :numref:`Chapter %s `. The overall procedure for generating an experiment is shown in :numref:`Figure %s `, with the scripts to generate and run the workflow shown in red. An overview of the required steps appears below. However, users can expect to access other referenced sections of this User's Guide for more detail. + + #. Clone the SRW App from GitHub: + + .. code-block:: console + + git clone -b develop https://github.com/ufs-community/ufs-srweather-app.git + + #. Check out the external repositories: + + .. code-block:: console + + cd ufs-srweather-app + ./manage_externals/checkout_externals + + #. Set up the build environment and build the executables. + + * **Option 1:** + + .. code-block:: console + + ./devbuild.sh --platform= + + where is replaced with the name of the platform the user is working on. Valid values are: ``cheyenne`` | ``gaea`` | ``hera`` | ``jet`` | ``macos`` | ``odin`` | ``orion`` | ``singularity`` | ``wcoss_dell_p3`` + + * **Option 2:** + + .. code-block:: console + + source etc/lmod-setup.sh + + where refers to the user's platform (e.g., ``macos``, ``gaea``, ``odin``, ``singularity``). + + Users will also need to load the "build" modulefile appropriate to their system. On Level 3 & 4 systems, users can adapt an existing modulefile (such as ``build_macos_gnu``) to their system. + + .. code-block:: console + + module use + module load build__ + + From the top-level ``ufs-srweather-app`` directory, run: + + .. code-block:: console + + mkdir build + cd build + cmake .. -DCMAKE_INSTALL_PREFIX=.. + make -j 4 >& build.out & + + #. Download and stage data (both the fix files and the :term:`IC/LBC` files) according to the instructions in :numref:`Chapter %s ` (if on a Level 3-4 system). + + #. Configure the experiment parameters. + + .. 
code-block:: console + + cd regional_workflow/ush + cp config.community.sh config.sh + + Users will need to adjust the experiment parameters in the ``config.sh`` file to suit the needs of their experiment (e.g., date, time, grid, physics suite, etc.). More detailed guidance is available in :numref:`Chapter %s `. Parameters and valid values are listed in :numref:`Chapter %s `. + + #. Load the python environment for the regional workflow. Users on Level 3-4 systems will need to use one of the existing ``wflow_`` modulefiles (e.g., ``wflow_macos``) and adapt it to their system. + + .. code-block:: console + + module use + module load wflow_ + conda activate regional_workflow + + #. Generate the experiment workflow. + + .. code-block:: console + + ./generate_FV3LAM_wflow.sh + + #. Run the regional workflow. There are several methods available for this step, which are discussed in :numref:`Chapter %s ` and :numref:`Chapter %s `. One possible method is summarized below. It requires the Rocoto Workflow Manager. + + .. code-block:: console + + cd $EXPTDIR + ./launch_FV3LAM_wflow.sh + + To launch the workflow and check the experiment's progress: + + .. code-block:: console + + ./launch_FV3LAM_wflow.sh; tail -n 40 log.launch_FV3LAM_wflow + +Optionally, users may :ref:`configure their own grid `, instead of using a predefined grid, and :ref:`plot the output ` of their experiment(s). diff --git a/docs/UsersGuide/source/RocotoInfo.rst b/docs/UsersGuide/source/RocotoInfo.rst index 03d13571cb..f2ed1836ef 100644 --- a/docs/UsersGuide/source/RocotoInfo.rst +++ b/docs/UsersGuide/source/RocotoInfo.rst @@ -4,39 +4,36 @@ Additional Rocoto Information ============================= The tasks in the SRW Application (:numref:`Table %s `) are typically run using -the Rocoto Workflow Manager. Rocoto is a Ruby program that interfaces with the batch system on an +the Rocoto Workflow Manager. Rocoto is a Ruby program that communicates with the batch system on an HPC system to run and manage dependencies between the tasks. Rocoto submits jobs to the HPC batch -system as the task dependencies allow, and runs one instance of the workflow for a set of user-defined -cycles. More information on Rocoto can be found at https://github.com/christopherwharrop/rocoto/wiki/documentation. +system as the task dependencies allow and runs one instance of the workflow for a set of user-defined +cycles. More information about Rocoto can be found on the `Rocoto Wiki `__. The SRW App workflow is defined in a Jinja-enabled Rocoto XML template called ``FV3LAM_wflow.xml``, -which resides in the ``regional_workflow/ufs/templates`` directory. When the ``generate_FV3LAM_wflow.sh`` +which resides in the ``regional_workflow/ush/templates`` directory. When the ``generate_FV3LAM_wflow.sh`` script is run, the ``fill_jinja_template.py`` script is called, and the parameters in the template file are filled in. The completed file contains the workflow task names, parameters needed by the job scheduler, -and task interdependencies. The generated XML file is then copied to the experiment directory: +and task interdependencies. The generated XML file is then copied to the experiment directory: ``$EXPTDIR/FV3LAM_wflow.xml``. -There are a number of Rocoto commands available to run and monitor the workflow and can be found in the -complete `Rocoto documentation `_. +There are a number of Rocoto commands available to run and monitor the workflow; users can find more information in the +complete `Rocoto documentation `__. 
Descriptions and examples of commonly used commands are discussed below. rocotorun -========= +========== The ``rocotorun`` command is used to run the workflow by submitting tasks to the batch system. It will -automatically resubmit failed tasks and can recover from system outages without user intervention. -An example is: +automatically resubmit failed tasks and can recover from system outages without user intervention. The command takes the following format: .. code-block:: console - rocotorun -w /path/to/workflow/xml/file -d /path/to/workflow/database/file -v 10 + rocotorun -w -d -v 10 where * ``-w`` specifies the name of the workflow definition file. This must be an XML file. -* ``-d`` specifies the name of the database file that is to be used to store the state of the workflow. - The database file is a binary file created and used only by Rocoto and need not exist prior to the first - time the command is run. -* ``-v`` (optional) specified level of verbosity. If no level is specified, a level of 1 is used. +* ``-d`` specifies the name of the database file that stores the state of the workflow. The database file is a binary file created and used only by Rocoto. It need not exist prior to the first time the command is run. +* ``-v`` (optional) specified level of verbosity. If no level is specified, a level of 1 is used. From the ``$EXPTDIR`` directory, the ``rocotorun`` command for the workflow would be: @@ -45,58 +42,58 @@ From the ``$EXPTDIR`` directory, the ``rocotorun`` command for the workflow woul rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db It is important to note that the ``rocotorun`` process is iterative; the command must be executed -many times before the entire workflow is completed, usually every 2-10 minutes. This command can be -placed in the user’s crontab and cron will call it with a specified frequency. More information on -this command can be found at https://github.com/christopherwharrop/rocoto/wiki/documentation. +many times before the entire workflow is completed, usually every 1-10 minutes. This command can be +placed in the user’s crontab, and cron will call it with a specified frequency. More information on +this command can be found in the `Rocoto documentation `__. The first time the ``rocotorun`` command is executed for a workflow, the files ``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are created. There is usually no need for the user to modify these files. Each time this command is executed, the last known state of the workflow is read from the ``FV3LAM_wflow.db`` file, the batch system is queried, jobs are submitted for tasks whose dependencies have been satisfied, -and the current state of the workflow is saved in ``FV3LAM_wflow.db``. If there is a need to relaunch -the workflow from scratch, both database files can be deleted, and the workflow can be run using ``rocotorun`` -or the launch script ``launch_FV3LAM_wflow.sh`` (executed multiple times as described above). +and the current state of the workflow is saved in ``FV3LAM_wflow.db``. If there is a need to relaunch +the workflow from scratch, both database files can be deleted, and the workflow can be run by executing the ``rocotorun`` command +or the launch script (``launch_FV3LAM_wflow.sh``) multiple times. rocotostat -========== +=========== ``rocotostat`` is a tool for querying the status of tasks in an active Rocoto workflow. Once the workflow has been started with the ``rocotorun`` command, Rocoto can also check the status of the workflow using the ``rocotostat`` command: .. 
code-block:: console - rocotostat -w /path/to/workflow/xml/file -d /path/to/workflow/database/file + rocotostat -w -d Executing this command will generate a workflow status table similar to the following: .. code-block:: console - CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION - ============================================================================================================================= - 201907010000 make_grid 175805 QUEUED - 0 0.0 - 201907010000 make_orog - - - - - - 201907010000 make_sfc_climo - - - - - - 201907010000 get_extrn_ics druby://hfe01:36261 SUBMITTING - 0 0.0 - 201907010000 get_extrn_lbcs druby://hfe01:36261 SUBMITTING - 0 0.0 - 201907010000 make_ics - - - - - - 201907010000 make_lbcs - - - - - - 201907010000 run_fcst - - - - - - 201907010000 run_post_f000 - - - - - - 201907010000 run_post_f001 - - - - - - 201907010000 run_post_f002 - - - - - - 201907010000 run_post_f003 - - - - - - 201907010000 run_post_f004 - - - - - - 201907010000 run_post_f005 - - - - - - 201907010000 run_post_f006 - - - - - + CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION + ============================================================================================================= + 201907010000 make_grid 175805 QUEUED - 0 0.0 + 201907010000 make_orog - - - - - + 201907010000 make_sfc_climo - - - - - + 201907010000 get_extrn_ics druby://hfe01:36261 SUBMITTING - 0 0.0 + 201907010000 get_extrn_lbcs druby://hfe01:36261 SUBMITTING - 0 0.0 + 201907010000 make_ics - - - - - + 201907010000 make_lbcs - - - - - + 201907010000 run_fcst - - - - - + 201907010000 run_post_f000 - - - - - + 201907010000 run_post_f001 - - - - - + 201907010000 run_post_f002 - - - - - + 201907010000 run_post_f003 - - - - - + 201907010000 run_post_f004 - - - - - + 201907010000 run_post_f005 - - - - - + 201907010000 run_post_f006 - - - - - This table indicates that the ``make_grid`` task was sent to the batch system and is now queued, while the ``get_extrn_ics`` and ``get_extrn_lbcs`` tasks for the ``201907010000`` cycle are in the process of being submitted to the batch system. Note that issuing a ``rocotostat`` command without an intervening ``rocotorun`` command will not result in an -updated workflow status table; it will print out the same table. It is the ``rocotorun`` command that updates -the workflow database file (in this case ``FV3LAM_wflow.db``, located in ``$EXPTDIR``); the ``rocotostat`` command -reads the database file and prints the table to the screen. To see an updated table, the ``rocotorun`` command +updated workflow status table; it will print out the same table. It is the ``rocotorun`` command that updates +the workflow database file (in this case ``FV3LAM_wflow.db``, located in ``$EXPTDIR``). The ``rocotostat`` command +reads the database file and prints the table to the screen. To see an updated table, the ``rocotorun`` command must be executed followed by the ``rocotostat`` command. After issuing the ``rocotorun`` command several times (over the course of several minutes or longer, depending @@ -104,45 +101,47 @@ on your grid size and computational resources), the output of the ``rocotostat`` .. 
code-block:: console - CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION - ============================================================================================================================ - 201907010000 make_grid 175805 SUCCEEDED 0 1 10.0 - 201907010000 make_orog 175810 SUCCEEDED 0 1 27.0 - 201907010000 make_sfc_climo 175822 SUCCEEDED 0 1 38.0 - 201907010000 get_extrn_ics 175806 SUCCEEDED 0 1 37.0 - 201907010000 get_extrn_lbcs 175807 SUCCEEDED 0 1 53.0 - 201907010000 make_ics 175825 SUCCEEDED 0 1 99.0 - 201907010000 make_lbcs 175826 SUCCEEDED 0 1 90.0 - 201907010000 run_fcst 175937 RUNNING - 0 0.0 - 201907010000 run_post_f000 - - - - - - 201907010000 run_post_f001 - - - - - - 201907010000 run_post_f002 - - - - - - 201907010000 run_post_f003 - - - - - - 201907010000 run_post_f004 - - - - - - 201907010000 run_post_f005 - - - - - - 201907010000 run_post_f006 - - - - - - -When the workflow runs to completion, all tasks will be marked as SUCCEEDED. The log files from the tasks + CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION + ==================================================================================================== + 201907010000 make_grid 175805 SUCCEEDED 0 1 10.0 + 201907010000 make_orog 175810 SUCCEEDED 0 1 27.0 + 201907010000 make_sfc_climo 175822 SUCCEEDED 0 1 38.0 + 201907010000 get_extrn_ics 175806 SUCCEEDED 0 1 37.0 + 201907010000 get_extrn_lbcs 175807 SUCCEEDED 0 1 53.0 + 201907010000 make_ics 175825 SUCCEEDED 0 1 99.0 + 201907010000 make_lbcs 175826 SUCCEEDED 0 1 90.0 + 201907010000 run_fcst 175937 RUNNING - 0 0.0 + 201907010000 run_post_f000 - - - - - + 201907010000 run_post_f001 - - - - - + 201907010000 run_post_f002 - - - - - + 201907010000 run_post_f003 - - - - - + 201907010000 run_post_f004 - - - - - + 201907010000 run_post_f005 - - - - - + 201907010000 run_post_f006 - - - - - + +When the workflow runs to completion, all tasks will be marked as SUCCEEDED. The log files from the tasks are located in ``$EXPTDIR/log``. If any tasks fail, the corresponding log file can be checked for error -messages. Optional arguments for the ``rocotostat`` command can be found at https://github.com/christopherwharrop/rocoto/wiki/documentation. +messages. Optional arguments for the ``rocotostat`` command can be found in the `Rocoto documentation `__. .. _rocotocheck: rocotocheck -=========== -Sometimes, issuing a ``rocotorun`` command will not cause the next task to launch. ``rocotocheck`` is a -tool that can be used to query detailed information about a task or cycle in the Rocoto workflow. To -determine the cause of a particular task not being submitted, the ``rocotocheck`` command can be used +============ +Sometimes, issuing a ``rocotorun`` command will not cause the next task to launch. ``rocotocheck`` is a +tool that can be used to query detailed information about a task or cycle in the Rocoto workflow. To +determine why a particular task has not been submitted, the ``rocotocheck`` command can be used from the ``$EXPTDIR`` directory as follows: .. code-block:: console - rocotocheck -w /path/to/workflow/xml/file -d /path/to/workflow/database/ file -c YYYYMMDDHHMM -t taskname + rocotocheck -w -d file -c -t where -* ``-c`` is the cycle to query -* ``-t`` is the task name +* ``-c`` is the cycle to query in YYYYMMDDHHmm format +* ``-t`` is the task name (see default task names in :numref:`Chapter %s `) + +The cycle and task names appear in the first and second columns of the table output by ``rocotostat``. 
A specific example is: @@ -150,71 +149,71 @@ A specific example is: rocotocheck -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -c 201907010000 -t run_fcst -This will result in output similar to the following: +Running ``rocotocheck`` will result in output similar to the following: .. code-block:: console :emphasize-lines: 8,19,34 Task: run_fcst - account: gsd-fv3 - command: /scratch2/BMC/det/$USER/ufs-srweather-app/regional_workflow/ush/load_modules_run_task.sh "run_fcst" "/scratch2/BMC/det/$USER/ufs-srweather-app/regional_workflow/jobs/JREGIONAL_RUN_FCST" - cores: 24 - final: false - jobname: run_FV3 - join: /scratch2/BMC/det/$USER/expt_dirs/test_community/log/run_fcst_2019070100.log - maxtries: 3 - name: run_fcst - nodes: 1:ppn=24 - queue: batch - throttle: 9999999 - walltime: 04:30:00 - environment - CDATE ==> 2019070100 - CYCLE_DIR ==> /scratch2/BMC/det/$USER/UFS_CAM/expt_dirs/test_community/2019070100 - PDY ==> 20190701 - SCRIPT_VAR_DEFNS_FP ==> /scratch2/BMC/det/$USER/expt_dirs/test_community/var_defns.sh - dependencies - AND is satisfied - make_ICS_surf_LBC0 of cycle 201907010000 is SUCCEEDED - make_LBC1_to_LBCN of cycle 201907010000 is SUCCEEDED + account: gsd-fv3 + command: /scratch2/BMC/det/$USER/ufs-srweather-app/regional_workflow/ush/load_modules_run_task.sh "run_fcst" "/scratch2/BMC/det/$USER/ufs-srweather-app/regional_workflow/jobs/JREGIONAL_RUN_FCST" + cores: 24 + final: false + jobname: run_FV3 + join: /scratch2/BMC/det/$USER/expt_dirs/test_community/log/run_fcst_2019070100.log + maxtries: 3 + name: run_fcst + nodes: 1:ppn=24 + queue: batch + throttle: 9999999 + walltime: 04:30:00 + environment + CDATE ==> 2019070100 + CYCLE_DIR ==> /scratch2/BMC/det/$USER/UFS_CAM/expt_dirs/test_community/2019070100 + PDY ==> 20190701 + SCRIPT_VAR_DEFNS_FP ==> /scratch2/BMC/det/$USER/expt_dirs/test_community/var_defns.sh + dependencies + AND is satisfied + make_ICS_surf_LBC0 of cycle 201907010000 is SUCCEEDED + make_LBC1_to_LBCN of cycle 201907010000 is SUCCEEDED Cycle: 201907010000 - Valid for this task: YES - State: active - Activated: 2019-10-29 18:13:10 UTC - Completed: - - Expired: - + Valid for this task: YES + State: active + Activated: 2019-10-29 18:13:10 UTC + Completed: - + Expired: - Job: 513615 - State: DEAD (FAILED) - Exit Status: 1 - Tries: 3 - Unknown count: 0 - Duration: 58.0 + State: DEAD (FAILED) + Exit Status: 1 + Tries: 3 + Unknown count: 0 + Duration: 58.0 -This shows that although all dependencies for this task are satisfied (see the dependencies section, highlighted above), +This output shows that although all dependencies for this task are satisfied (see the dependencies section, highlighted above), it cannot run because its ``maxtries`` value (highlighted) is 3. Rocoto will attempt to launch it at most 3 times, -and it has already been tried 3 times (the ``Tries`` value, also highlighted). +and it has already been tried 3 times (note the ``Tries`` value, also highlighted). The output of the ``rocotocheck`` command is often useful in determining whether the dependencies for a given task -have been met. If not, the dependencies section in the output of ``rocotocheck`` will indicate this by stating that a +have been met. If not, the dependencies section in the output of ``rocotocheck`` will indicate this by stating that a dependency "is NOT satisfied". rocotorewind -============ -``rocotorewind`` is a tool that attempts to undo the effects of running a task and is commonly used to rerun part -of a workflow that has failed. 
If a task fails to run (the STATE is DEAD), and needs to be restarted, the ``rocotorewind``
-command will rerun tasks in the workflow. The command line options are the same as those described in the ``rocotocheck``
-:numref:`section %s `, and the general usage statement looks like:
+=============
+``rocotorewind`` is a tool that attempts to undo the effects of running a task. It is commonly used to rerun part
+of a workflow that has failed. If a task fails to run (the STATE is DEAD) and needs to be restarted, the ``rocotorewind``
+command will rerun tasks in the workflow. The command line options are the same as those described for ``rocotocheck``
+(in :numref:`section %s `), and the general usage statement looks like the following:

.. code-block:: console

-   rocotorewind -w /path/to/workflow/xml/file -d /path/to/workflow/database/ file -c YYYYMMDDHHMM -t taskname
+   rocotorewind -w <path/to/workflow/xml/file> -d <path/to/workflow/database/file> -c <YYYYMMDDHHMM> -t <taskname>

Running this command will edit the Rocoto database file ``FV3LAM_wflow.db`` to remove evidence that the job has
been run. ``rocotorewind`` is recommended over ``rocotoboot`` for restarting a task, since ``rocotoboot`` will force a specific
-task to run, ignoring all dependencies and throttle limits. The throttle limit, denoted by the variable cyclethrottle
-in the ``FV3LAM_wflow.xml`` file, limits how many cycles can be active at one time. An example of how to use this
+task to run, ignoring all dependencies and throttle limits. The throttle limit, denoted by the variable ``cyclethrottle``
+in the ``FV3LAM_wflow.xml`` file, limits how many cycles can be active at one time. An example of how to use the
``rocotorewind`` command to rerun the forecast task from ``$EXPTDIR`` is:

.. code-block::
@@ -222,9 +221,9 @@ command to rerun the forecast task from ``$EXPTDIR`` is:

   rocotorewind -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -c 201907010000 -t run_fcst

rocotoboot
-==========
-``rocotoboot`` will force a specific task of a cycle in a Rocoto workflow to run. All dependencies and throttle
-limits are ignored, and it is generally recommended to use ``rocotorewind`` instead. An example of how to
+===========
+``rocotoboot`` will force a specific task of a cycle in a Rocoto workflow to run. All dependencies and throttle
+limits are ignored, and it is generally recommended to use ``rocotorewind`` instead. An example of how to use this
command to rerun the ``make_ics`` task from ``$EXPTDIR`` is:

..
code-block:: console diff --git a/docs/UsersGuide/source/SRW_NATLEV_table.csv b/docs/UsersGuide/source/SRW_NATLEV_table.csv new file mode 100644 index 0000000000..ba82e4e3d6 --- /dev/null +++ b/docs/UsersGuide/source/SRW_NATLEV_table.csv @@ -0,0 +1,210 @@ +No.,Field Description,Level Type,Short Name,nlvl +1,Height on pressure surface,isobaric,HGT,4 +2,Temperature on pressure surface,isobaric,TMP,5 +3,Relative humidity on pressure surface,isobaric,RH,4 +4,U component of wind on pressure surface,isobaric,UGRD,4 +5,V component of wind on pressure surface,isobaric,VGRD,4 +6,Omega on pressure surface,isobaric,VVEL,4 +7,Specific humidity on pressure surface,isobaric,SPFH,4 +8,Absolute vorticity on pressure surface,isobaric,ABSV,4 +9,Pressure on model surface,hybrid,PRES,64 +10,Height on model surface,hybrid,HGT,64 +11,Temperature on model surface,hybrid,TMP,64 +12,Specific humidity on model surface,hybrid,SPFH,64 +13,U component of wind on model surface,hybrid,UGRD,64 +14,V component of wind on model surface,hybrid,VGRD,64 +15,Omega on model surface,hybrid,VVEL,64 +16,Vertical velocity on model surface,hybrid,DZDT,64 +17,Turbulent kinetic energy on model surface,hybrid,TKE,64 +18,Temperature tendency from grid scale latent heat release (time-averaged),hybrid,LRGHR,64 +19,Mesinger (Membrane) sea level pressure,mean sea level,MSLET,1 +20,Shuell sea level pressure,mean sea level,PRES,1 +21,Temperature at 2m,height agl,TMP,1 +22,Specific humidity at 2m,height agl,SPFH,1 +23,Dew point temperature at 2m,height agl,DPT,1 +24,Relative humidity at 2m,height agl,RH,1 +25,U component of wind at 10m,height agl,UGRD,1 +26,V component of wind at 10m,height agl,VGRD,1 +27,Potential temperature at 10m,height agl,POT,1 +28,Specific humidity at 10m,height agl,SPFH,1 +29,Surface Pressure,surface,PRES,1 +30,Terrain height,surface,HGT,1 +31,Skin potential temperature,surface,POT,1 +32,Skin specific humidity,surface,SPFH,1 +33,Skin temperature,surface,TMP,1 +34,Maximum updraft vertical velocity (100-1000 hPa),isobaric,MAXUVV,1 +35,Maximum downdraft vertical velocity (100-1000 hPa),isobaric,MAXDVV,1 +36,Maximum updraft helicity (0-3 km),height agl,MXUPHL,1 +37,Maximum updraft helicity (2-5 km),height agl,MXUPHL,1 +38,Minimum updraft helicity (2-5 km),height agl,MNUPHL,1 +39,Minimum updraft helicity (0-3 km),height agl,MNUPHL,1 +40,Maximum relative vertical vorticity (0-1 km),height agl,RELV,1 +41,Maximum relative vertical vorticity at hybrid level 1,hybrid,RELV,1 +42,Maximum relative vertical vorticity (0-2 km),height agl,RELV,1 +43,Maximum U-component wind at 10m,height agl,MAXUW,1 +44,Maximum V-component wind at 10m,height agl,MAXVW,1 +45,Maximum derived radar reflectivity at 1 km,height agl,MAXREF,1 +46,Maximum derived radar reflectivity at -10 C,isothermal,MAXREF,1 +47,Radar reflectivity at -10 C,isothermal,REFD,1 +48,Maximum 2m temperature,height agl,TMAX,1 +49,Minimum 2m temperature,height agl,TMIN,1 +50,Maximum 2m RH,height agl,MAXRH,1 +51,Minimum 2m RH,height agl,MINRH,1 +52,Soil temperature in between each soil layer,depth below land surface,TSOIL,4 +53,Soil moisture in between each soil layer,depth below land surface,SOILW,4 +54,Total soil moisture,depth below land surface,SOILM,1 +55,Heat exchange coeff at surface,surface,SFEXC,1 +56,Vegetation cover,surface,VEG,1 +57,Soil moisture availability,depth below land surface,MSTAV,1 +58,Soil temperature at 3m,depth below land surface,TSOIL,1 +59,Ground heat flux (instantaneous),surface,GFLUX,1 +60,Plant canopy surface water,surface,CNWAT,1 +61,Snow water 
equivalent,surface,WEASD,1 +62,Lifted index—best,pressure above ground,4LFTX,1 +63,Column integrated precipitable water,entire atmosphere,PWAT,1 +64,Accumulated total precipitation,surface,APCP,1 +65,Accumulated grid-scale precipitation,surface,NCPCP,1 +66,Continuous accumulated total precipitation,surface,APCP,1 +67,Continuous accumulated grid-scale precipitation,surface,NCPCP,1 +68,Accumulated total snow melt,surface,SNOM,1 +69,Accumulated storm surface runoff,surface,SSRUN,1 +70,Accumulated base flow runoff,surface,BGRUN,1 +71,Categorical rain (instantaneous),surface,CRAIN,1 +72,Categorical snow (instantaneous),surface,CSNOW,1 +73,Categorical ice pellets (instantaneous),surface,CICEP,1 +74,Categorical freezing rain (instantaneous),surface,CFRZR,1 +75,Precipitation rate (instantaneous),surface,PRATE,1 +76,Fraction of frozen precipitation,surface,CPOFP,1 +77,Cloud water mixing ratio on model surface,hybrid,CLMR,64 +78,Cloud ice mixing ratio on model surface,hybrid,ICMR,64 +79,Graupel mixing ratio on mmodel surface,hybrid,GRLE,64 +80,Cloud fraction on model surface,hybrid,TCDC,64 +81,Rain mixing ratio on model surface,hybrid,RWMR,64 +82,Snow mixing ratio on model surface,hybrid,SNMR,64 +83,Rimming factor for Ferrier scheme on model surface,hybrid,RIME,64 +84,Total condensate for Ferrier scheme on mode surface,hybrid,TCOND,64 +85,Model level fraction of rain for Ferrier scheme,hybrid,FRAIN,64 +86,Model level fraction of ice for Ferrier scheme,hybrid,FICE,64 +87,Low level cloud fraction,low cloud layer,LCDC,1 +88,Mid level cloud fraction,mid cloud layer,MCDC,1 +89,High level cloud fraction,high cloud layer,HCDC,1 +90,Total cloud fraction,entire atmosphere,TCDC,1 +91,Total cloud fraction (time-averaged),entire atmosphere,TCDC,1 +92,stratospheric cloud fraction (time-averaged),entire atmosphere,CDLYR,1 +93,Outgoing surface shortwave radiation (instantaneous),surface,USWRF,1 +94,Outgoing surface longwave radiation (instantaneous),surface,ULWRF,1 +95,Incoming surface shortwave radiation (time-averaged),surface,DSWRF,1 +96,Incoming surface longwave radiation (time-averaged),surface,DLWRF,1 +97,Outgoing surface shortwave radiation (time-averaged),surface,USWRF,1 +98,Outgoing surface longwave radiation (time-averaged),surface,ULWRF,1 +99,Outgoing model top shortwave radiation (time-averaged),top of atmosphere,USWRF,1 +100,Outgoing model top longwave radiation (time-averaged),top of atmosphere,ULWRF,1 +101,Incoming surface shortwave radiation (instantaneous),surface,DSWRF,1 +102,Incoming surface longwave radiation (instantaneous),surface,DLWRF,1 +103,Clear sky incoming surface shortwave (instantaneous),surface,CSDSF,1 +104,Roughness length,surface,SFCR,1 +105,Friction velocity,surface,FRICV,1 +106,Surface drag coefficient,surface,CD,1 +107,Surface u wind stress,surface,UFLX,1 +108,Surface v wind stress,surface,VFLX,1 +109,Surface sensible heat flux (time-averaged),surface,SHTFL,1 +110,Ground heat flux (time-averaged),surface,GFLUX,1 +111,Snow phase change heat flux (time-averaged),surface,SNOHF,1 +112,Surface latent heat flux (time-averaged),surface,LHTFL,1 +113,Accumulated surface evaporation,surface,EVP,1 +114,Accumulated potential evaporation,surface,PEVAP,1 +115,Surface sensible heat flux (instantaneous),surface,SHTFL,1 +116,Surface latent heat flux (instantaneous),surface,LHTFL,1 +117,Latitude,surface,NLAT,1 +118,Longitude,surface,ELON,1 +119,Land sea mask (land=1 sea=0),surface,LAND,1 +120,Sea ice mask,surface,ICEC,1 +121,Mass point at eta surface mask,surface,LMH,1 +122,Velocity point at eta 
surface mask,surface,LMV,1 +123,Surface albedo,surface,ALBDO,1 +124,Sea surface temperature,surface,WTMP,1 +125,Pressure in boundary layer (30 mb means),pressure agl,PRES,6 +126,Temperature in boundary layer (30 mb means),pressure agl,TMP,6 +127,Potential temperature in boundary layer (30 mb means),pressure agl,POT,1 +128,Dew point temperature in boundary layer (30 mb means),pressure agl,DPT,1 +129,Specific humidity in boundary layer (30 mb means),pressure agl,SPFH,6 +130,RH in boundary layer (30 mb means),pressure agl,RH,6 +131,Moisture convergence in boundary layer (30 mb means),pressure agl,MCONV,1 +132,Precipitable water in boundary layer (30 mb means),pressure agl,PWAT,1 +133,U wind in boundary layer (30 mb means),pressure agl,UGRD,6 +134,V wind in boundary layer (30 mb means),pressure agl,VGRD,6 +135,Accumulated land surface model precipitation,surface,LSPA,1 +136,Model top pressure,top of atmosphere,PRES,1 +137,Pressure thickness,hybrid,PRES,1 +138,Sigma pressure thickness,hybrid,PRES,1 +139,Plant canopy surface water,surface,CNWAT,1 +140,Ground heat flux (instantaneous),surface,GFLUX,1 +141,Lifted index—surface based (500-1000 hPa),isobaric,LFTX,1 +142,Convective available potential energy,surface,CAPE,1 +143,Best cape,pressure above ground,CAPE,1 +144,Mixed layer cape,pressure above ground,CAPE,1 +145,Unstable cape,pressure above ground,CAPE,1 +146,Convective inhibition,surface,CIN,1 +147,Best cin,pressure above ground,CIN,1 +148,Mixed layer cin,pressure above ground,CIN,1 +149,Unstable cin,pressure above ground,CIN,1 +150,LCL level pressure,pressure layer agl,PLPL,1 +151,Helicity,height agl,HLCY,2 +152,U component storm motion,height agl,USTM,1 +153,V component storm motion,height agl,VSTM,1 +154,Cloud bottom pressure,cloud base,PRES,1 +155,Cloud top pressure,cloud top,PRES,1 +156,Cloud top temperature,cloud top,TMP,1 +157,Pressure at tropopause,tropopause,PRES,1 +158,Height at tropopause,tropopause,HGT,1 +159,Temperature at tropopause,tropopause,TMP,1 +160,U component of wind at tropopause,tropopause,UGRD,1 +161,V component of wind at tropopause,tropopause,VGRD,1 +162,Wind shear at tropopause,tropopause,VWSH,1 +163,Temperature at flight levels,height msl,TMP,10 +164,U component of wind at flight levels,height msl,UGRD,10 +165,V component of wind at flight levels,height msl,VGRD,10 +166,Freezing level height,0 degree isotherm,HGT,1 +167,Freezing level relative humidity,0 degree isotherm,RH,1 +168,Highest freezing level height,highest tropospheric frz lvl,HGT,1 +169,Maximum wind pressure level,max wind,PRES,1 +170,Maximum wind height,max wind,HGT,1 +171,U-component of maximum wind,max wind,UGRD,1 +172,V-component of maximum wind,max wind,VGRD,1 +173,Maximum wind speed at 10m,height agl,WIND,1 +174,Cloud bottom height (above MSL),cloud base,HGT,1 +175,Cloud top height (above MSL),cloud top,HGT,1 +176,GSD visibility,surface,VIS,1 +177,Composite radar reflectivity,entire atmosphere,REFC,1 +178,Grid scale cloud bottom pressure,grid scale cloud bottom,PRES,1 +179,Grid scale cloud top pressure,grid scale cloud top,PRES,1 +180,Column integrated cloud water,entire atmosphere,TCOLW,1 +181,Column integrated cloud ice,entire atmosphere,TCOLI,1 +182,Column integrated rain,entire atmosphere,TCOLR,1 +183,Column integrated snow,entire atmosphere,TCOLS,1 +184,Column integrated total condensate,entire atmosphere,TCOLC,1 +185,Column integrated graupel,entire atmosphere,TCOLG,1 +186,Vegetation type,surface,VGTYP,1 +187,Soil type,surface,SOTYP,1 +188,Canopy conductance,surface,CCOND,1 +189,Planetary 
boundary layer height,surface,HPBL,1 +190,Snow depth,surface,SNOD,1 +191,Snow sublimation,surface,SBSNO,1 +192,Air dry soil moisture,surface,SMDRY,1 +193,Soil moist porosity,surface,POROS,1 +194,Minimum stomatal resistance,surface,RSMIN,1 +195,Number of root layers,surface,RLYRS,1 +196,Soil moist wilting point,surface,WILT,1 +197,Soil moist reference,surface,SMREF,1 +198,Canopy conductance - solar component,surface,RCS,1 +199,Canopy conductance - temperature component,surface,RCT,1 +200,Canopy conductance - humidity component,surface,RCQ,1 +201,Canopy conductance - soil component,surface,RCSOL,1 +202,Potential evaporation,surface,PEVPR,1 +203,Surface wind gust,surface,GUST,1 +204,Lowest wet bulb zero height,lowest lvl wet bulb zero,HGT,1 +205,Leaf area index,surface,LAI,1 +206,Clear sky incoming surface shortwave (instantaneous),surface,CSDSF,1 +207,Richardson number planetary boundary layer height,planetary boundary layer,HGT,1 +208,Mixing height,surface,MIXHT,1 +209,Time-averaged percentage snow cover,surface,SNOWC,1 \ No newline at end of file diff --git a/docs/UsersGuide/source/SRW_NATLEV_table.rst b/docs/UsersGuide/source/SRW_NATLEV_table.rst new file mode 100644 index 0000000000..436927355e --- /dev/null +++ b/docs/UsersGuide/source/SRW_NATLEV_table.rst @@ -0,0 +1,11 @@ +************************************************************ +Fields Requested in the UPP Parameter Table for SRW NATLEV +************************************************************ + +Field description (column 1), level type as defined by WMO (column 2), abbreviated names +as they appear in the Grib2 output file (column 3), and number of levels output (column 4). + +.. csv-table:: + :file: SRW_NATLEV_table.csv + :widths: 9, 40, 30, 15, 10 + :header-rows: 1 diff --git a/docs/UsersGuide/source/SRW_PRSLEV_table.csv b/docs/UsersGuide/source/SRW_PRSLEV_table.csv new file mode 100644 index 0000000000..581a43a5e4 --- /dev/null +++ b/docs/UsersGuide/source/SRW_PRSLEV_table.csv @@ -0,0 +1,257 @@ +No.,Field Description,Level Type,Short Name,nlvl +1,Pressure on model surface,hybrid,PRES,2 +2,Height on model surface,hybrid,HGT,2 +3,Temperature on model surface,hybrid,TMP,2 +4,Potential temperature on model surface,hybrid,POT,2 +5,Dew point temperature on model surface,hybrid,DPT,2 +6,Specific humidity on model surface,hybrid,SPFH,1 +7,Relative humidity on model surface,hybrid,RH,1 +8,U component of wind on model surface,hybrid,UGRD,2 +9,V component of wind on model surface,hybrid,VGRD,2 +10,Omega on model surface,hybrid,VVEL,1 +11,Vertical velocity on model surface,hybrid,DZDT,1 +12,Turbulent kinetic energy on model surface,hybrid,TKE,2 +13,Rain mixing ratio on model surface,hybrid,RWMR,2 +14,Snow mixing ratio on model surface,hybrid,SNMR,2 +15,Rimming factor for Ferrier scheme on model surface,hybrid,RIME,2 +16,Total condensate for Ferrier scheme on mode surface,hybrid,TCOND,2 +17,Radar reflectivity on model surface,hybrid,REFD,2 +18,Master length scale on model surface,hybrid,BMIXL,1 +19,Height on pressure surface,isobaric,HGT,45 +20,Temperature on pressure surface,isobaric,TMP,45 +21,Dew point temperature on pressure surface,isobaric,DPT,45 +22,Specific humidity on pressure surface,isobaric,SPFH,45 +23,Relative humidity on pressure surface,isobaric,RH,45 +24,Moisture convergence on pressure surface,isobaric,MCONV,2 +25,U component of wind on pressure surface,isobaric,UGRD,45 +26,V component of wind on pressure surface,isobaric,VGRD,45 +27,Vertical velocity on pressure surface,isobaric,DZDT,45 +28,Omega on pressure 
surface,isobaric,VVEL,45 +29,Absolute vorticity on pressure surface,isobaric,ABSV,10 +30,Geostrophic streamfunction on pressure surface,isobaric,STRM,2 +31,Turbulent kinetic energy on pressure surface,isobaric,TKE,45 +32,Cloud ice mixing ratio on pressure surface,isobaric,ICMR,45 +33,Cloud water mixing ratio on pressure surface,isobaric,CLMR,45 +34,Rain mixing ratio on pressure surface,isobaric,RWMR,45 +35,Graupel mixing ratio on pressure surface,isobaric,GRLE,45 +36,Snow mixing ratio on pressure surface,isobaric,SNMR,45 +37,Rimming factor for Ferrier scheme on pressure surface,isobaric,RIME,45 +38,Mesinger (Membrane) sea level pressure,mean sea level,MSLET,1 +39,Shuell sea level pressure,mean sea level,PRES,1 +40,Temperature at 2m,height agl,TMP,1 +41,Specific humidity at 2m,height agl,SPFH,1 +42,Dew point temperature at 2m,height agl,DPT,1 +43,Relative humidity at 2m,height agl,RH,1 +44,U component of wind at 10m,height agl,UGRD,1 +45,V component of wind at 10m,height agl,VGRD,1 +46,Surface wind gust,surface,GUST,1 +47,LCL level pressure,pressure layer agl,PLPL,1 +48,Potential temperature at 10m,height agl,POT,1 +49,Specific humidity at 10m,height agl,SPFH,1 +50,Surface Pressure,surface,PRES,1 +51,Terrain height,surface,HGT,1 +52,Skin potential temperature,surface,POT,1 +53,Skin specific humidity,surface,SPFH,1 +54,Skin temperature,surface,TMP,1 +55,Soil temperature at 3m,depth below land surface,TSOIL,1 +56,Soil temperature in between each soil layer,depth below land surface,TSOIL,4 +57,Soil moisture in between each soil layer,depth below land surface,SOILW,4 +58,Liquid soil moisture in between each soil layer,depth below land surface,SOILL,4 +59,Total soil moisture,depth below land surface,SOILM,1 +60,Plant canopy surface water,surface,CNWAT,1 +61,Snow water equivalent,surface,WEASD,1 +62,Snow cover in percentage,surface,SNOWC,1 +63,Heat exchange coeff at surface,surface,SFEXC,1 +64,Vegetation cover,surface,VEG,1 +65,Vegetation type,surface,VGTYP,1 +66,Soil type,surface,SOTYP,1 +67,Snow free albedo,surface,SNFALB,1 +68,Maximum snow albedo,surface,MXSALB,1 +69,Canopy conductance,surface,CCOND,1 +70,Canopy conductance - solar component,surface,RCS,1 +71,Canopy conductance - temperature component,surface,RCT,1 +72,Canopy conductance - humidity component,surface,RCQ,1 +73,Canopy conductance - soil component,surface,RCSOL,1 +74,Soil moist reference,surface,SMREF,1 +75,Soil moist porosity,surface,POROS,1 +76,Number of root layers,surface,RLYRS,1 +77,Minimum stomatal resistance,surface,RSMIN,1 +78,Snow depth,surface,SNOD,1 +79,Air dry soil moisture,surface,SMDRY,1 +80,Soil moist wilting point,surface,WILT,1 +81,Soil moisture availability,depth below land surface,MSTAV,1 +82,Ground heat flux (instantaneous),surface,GFLUX,1 +83,Lifted index—surface based (500-1000 hPa),isobaric,LFTX,1 +84,Lifted index—best,pressure above ground,4LFTX,1 +85,Lifted index—parcel,pressure above ground,PLI,1 +86,Convective available potential energy,surface,CAPE,1 +87,Best cape,pressure above ground,CAPE,1 +88,Mixed layer cape,pressure above ground,CAPE,1 +89,Unstable cape,pressure above ground,CAPE,1 +90,Convective inhibition,surface,CIN,1 +91,Best cin,pressure above ground,CIN,1 +92,Mixed layer cin,pressure above ground,CIN,1 +93,Unstable cin,pressure above ground,CIN,1 +94,Column integrated precipitable water,entire atmosphere,PWAT,1 +95,Helicity,height agl,HLCY,2 +96,U component storm motion,height agl,USTM,1 +97,V component storm motion,height agl,VSTM,1 +98,Accumulated total precipitation,surface,APCP,1 
+99,Accumulated grid-scale precipitation,surface,NCPCP,1 +100,Continuous accumulated total precipitation,surface,APCP,1 +101,Continuous accumulated grid-scale precipitation,surface,NCPCP,1 +102,Accumulated total snow melt,surface,SNOM,1 +103,Accumulated storm surface runoff,surface,SSRUN,1 +104,Accumulated base flow runoff,surface,BGRUN,1 +105,Average water runoff,surface,WATR,1 +106,Categorical rain (instantaneous),surface,CRAIN,1 +107,Categorical snow (instantaneous),surface,CSNOW,1 +108,Categorical ice pellets (instantaneous),surface,CICEP,1 +109,Categorical freezing rain (instantaneous),surface,CFRZR,1 +110,Precipitation rate (instantaneous),surface,PRATE,1 +111,Fraction of frozen precipitation,surface,CPOFP,1 +112,Cloud water mixing ratio on model surface,hybrid,CLMR,2 +113,Cloud ice mixing ratio on model surface,hybrid,ICMR,2 +114,Graupel mixing ratio on model surface,hybrid,GRLE,1 +115,Cloud fraction on model surface,hybrid,TCDC,2 +116,Low level cloud fraction,low cloud layer,LCDC,1 +117,Mid level cloud fraction,mid cloud layer,MCDC,1 +118,High level cloud fraction,high cloud layer,HCDC,1 +119,Total cloud fraction,entire atmosphere,TCDC,1 +120,Total cloud fraction (time-averaged),entire atmosphere,TCDC,1 +121,stratospheric cloud fraction (time-averaged),entire atmosphere,CDLYR,1 +122,GSD visibility,cloud top,VIS,1 +123,Above-ground height of LCL,adiabatic condensation from surface,HGT,1 +124,Pressure of LCL,adiabatic condensation from surface,PRES,1 +125,Outgoing surface shortwave radiation (instantaneous),surface,USWRF,1 +126,Outgoing surface longwave radiation (instantaneous),surface,ULWRF,1 +127,Incoming surface shortwave radiation (time-averaged),surface,DSWRF,1 +128,Incoming surface longwave radiation (time-averaged),surface,DLWRF,1 +129,Outgoing surface shortwave radiation (time-averaged),surface,USWRF,1 +130,Outgoing surface longwave radiation (time-averaged),surface,ULWRF,1 +131,Outgoing model top shortwave radiation (time-averaged),top of atmosphere,USWRF,1 +132,Outgoing model top longwave radiation (time-averaged),top of atmosphere,ULWRF,1 +133,Outgoing longwave at top of atmosphere (instantaneous),top of atmosphere,ULWRF,1 +134,Total spectrum brightness temperature,top of atmosphere,BRTMP,1 +135,Incoming surface shortwave radiation (instantaneous),surface,DSWRF,1 +136,Incoming surface longwave radiation (instantaneous),surface,DLWRF,1 +137,Clear sky incoming surface shortwave (instantaneous),surface,CSDSF,1 +138,Roughness length,surface,SFCR,1 +139,Friction velocity,surface,FRICV,1 +140,Surface drag coefficient,surface,CD,1 +141,Surface u wind stress,surface,UFLX,1 +142,Surface v wind stress,surface,VFLX,1 +143,Surface sensible heat flux (time-averaged),surface,SHTFL,1 +144,Ground heat flux (time-averaged),surface,GFLUX,1 +145,Snow phase change heat flux (time-averaged),surface,SNOHF,1 +146,Surface latent heat flux (time-averaged),surface,LHTFL,1 +147,Accumulated surface evaporation,surface,EVP,1 +148,Accumulated potential evaporation,surface,PEVAP,1 +149,Surface sensible heat flux (instantaneous),surface,SHTFL,1 +150,Surface latent heat flux (instantaneous),surface,LHTFL,1 +151,Latitude,surface,NLAT,1 +152,Longitude,surface,ELON,1 +153,Land sea mask (land=1 sea=0),surface,LAND,1 +154,Sea ice mask,surface,ICEC,1 +155,Surface albedo,surface,ALBDO,1 +156,Sea surface temperature,surface,WTMP,1 +157,Pressure at tropopause,tropopause,PRES,1 +158,Height at tropopause,tropopause,HGT,1 +159,Temperature at tropopause,tropopause,TMP,1 +160,Potential temperature at 
tropopause,tropopause,POT,1 +161,U component of wind at tropopause,tropopause,UGRD,1 +162,V component of wind at tropopause,tropopause,VGRD,1 +163,Wind shear at tropopause,tropopause,VWSH,1 +164,U component of 0-1km level wind shear,height agl,VUCSH,1 +165,V-component of 0-1km level wind shear,height agl,VVCSH,1 +166,U component of 0-6km level wind shear,height agl,VUCSH,1 +167,V-component of 0-6km level wind shear,height agl,VVCSH,1 +168,Temperature at flight levels,height msl,TMP,10 +169,Temperature at flight levels,height agl,TMP,4 +170,U component of wind at flight levels,height msl,UGRD,10 +171,U component of wind at flight levels,height agl,UGRD,4 +172,V component of wind at flight levels,height msl,VGRD,10 +173,V component of wind at flight levels,height agl,VGRD,4 +174,Specific humidity at flight levels,height msl,SPFH,1 +175,Specific humidity at flight levels,height agl,SPFH,4 +176,Pressure at flight levels,height agl,PRES,4 +177,Freezing level height,0 degree isotherm,HGT,1 +178,Freezing level relative humidity,0 degree isotherm,RH,1 +179,Highest freezing level height,highest tropospheric frz lvl,HGT,1 +180,Lowest wet bulb zero height,lowest lvl wet bulb zero,HGT,1 +181,Pressure in boundary layer (30 mb means),pressure agl,PRES,6 +182,Temperature in boundary layer (30 mb means),pressure agl,TMP,6 +183,Potential temperature in boundary layer (30 mb means),pressure agl,POT,1 +184,Dew point temperature in boundary layer (30 mb means),pressure agl,DPT,1 +185,Specific humidity in boundary layer (30 mb means),pressure agl,SPFH,6 +186,RH in boundary layer (30 mb means),pressure agl,RH,6 +187,Moisture convergence in boundary layer (30 mb means),pressure agl,MCONV,1 +188,Precipitable water in boundary layer (30 mb means),pressure agl,PWAT,1 +189,U wind in boundary layer (30 mb means),pressure agl,UGRD,6 +190,V wind in boundary layer (30 mb means),pressure agl,VGRD,6 +191,Omega in boundary layer (30 mb means),pressure agl,VVEL,3 +192,Cloud bottom pressure,cloud base,PRES,1 +193,Cloud top pressure,cloud top,PRES,1 +194,Cloud top temperature,cloud top,TMP,1 +195,Cloud bottom height (above MSL),cloud base,HGT,1 +196,Cloud top height (above MSL),cloud top,HGT,1 +197,Maximum wind pressure level,max wind,PRES,1 +198,Maximum wind height,max wind,HGT,1 +199,U-component of maximum wind,max wind,UGRD,1 +200,V-component of maximum wind,max wind,VGRD,1 +201,Composite radar reflectivity,entire atmosphere,REFC,1 +202,Composite rain radar reflectivity,entire atmosphere,REFZR,1 +203,Composite ice radar reflectivity,entire atmosphere,REFZI,1 +204,Radar reflectivity at certain above ground heights,height agl,REFD,2 +205,Radar reflectivity from rain,height agl,REFZR,2 +206,Radar reflectivity from ice,height agl,REFZI,2 +207,Planetary boundary layer height,surface,HPBL,1 +208,Grid scale cloud bottom pressure,grid scale cloud bottom,PRES,1 +209,Grid scale cloud top pressure,grid scale cloud top,PRES,1 +210,Column integrated cloud water,entire atmosphere,TCOLW,1 +211,Column integrated cloud ice,entire atmosphere,TCOLI,1 +212,Column integrated rain,entire atmosphere,TCOLR,1 +213,Column integrated snow,entire atmosphere,TCOLS,1 +214,Column integrated total condensate,entire atmosphere,TCOLC,1 +215,Column integrated graupel,entire atmosphere,TCOLG,1 +216,Column integrated super cool liquid water,entire atmosphere,TCLSW,1 +217,Column integrated melting ice,entire atmosphere,TCOLM,1 +218,Height of lowest level super cool liquid water,lwst bot lvl of supercooled liq wtr,HGT,1 +219,Height of highest level super cool 
liquid water,hghst top lvl of supercooled liq wtr,HGT,1 +220,Ceiling height,cloud ceiling,HGT,1 +221,Accumulated land surface model precipitation,surface,LSPA,1 +222,Model top pressure,top of atmosphere,PRES,1 +223,Total column shortwave temperature tendency,entire atmosphere,SWHR,1 +224,Total column longwave temperature tendency,entire atmosphere,LWHR,1 +225,Total column gridded temperature tendency,entire atmosphere,LRGHR,1 +226,Column integrated moisture convergence,entire atmosphere,MCONV,1 +227,Planetary boundary layer regime,surface,PBLREG,1 +228,Transport wind u component,planetary boundary layer,UGRD,1 +229,Transport wind v component,planetary boundary layer,VGRD,1 +230,Richardson number planetary boundary layer height,planetary boundary layer,HGT,1 +231,Mixing height,surface,MIXHT,1 +232,Radar echo top,entire atmosphere,RETOP,1 +233,Ventilation rate,planetary boundary layer,VRATE,1 +234,Haines index,surface,HINDEX,1 +235,Maximum 2m temperature,height agl,TMAX,1 +236,Minimum 2m temperature,height agl,TMIN,1 +237,Maximum 2m RH,height agl,MAXRH,1 +238,Minimum 2m RH,height agl,MINRH,1 +239,Maximum U-component wind at 10m,height agl,MAXUW,1 +240,Maximum V-component wind at 10m,height agl,MAXVW,1 +241,Maximum wind speed at 10m,height agl,WIND,1 +242,Maximum 1km reflectivity,height agl,MAXREF,1 +243,Maximum updraft vertical velocity 100-1000 hPa,isobaric layer,MAXUVV,1 +244,Maximum downdraft vertical velocity 100-1000 hPa,isobaric layer,MAXDVV,1 +245,Lightning,surface,LTNG,1 +246,Radar derived vertically integrated liquid,entire atmosphere,VIL,1 +247,Updraft helicity (2-5 km),height agl,UPHL,1 +248,Maximum updraft helicity (2-5 km),height agl,MXUPHL,1 +249,Minimum updraft helicity (2-5 km),height agl,MNUPHL,1 +250,Minimum updraft helicity (0-3 km),height agl,MNUPHL,1 +251,Maximum updraft helicity (0-3 km),height agl,MXUPHL,1 +252,Maximum relative vertical vorticity (0-1 km),height agl,RELV,1 +253,Maximum relative vertical vorticity at hybrid level 1,hybrid,RELV,1 +254,Maximum relative vertical vorticity (0-2 km),height agl,RELV,1 +255,Maximum derived radar reflectivity at -10 C,isothermal,MAXREF,1 +256,Radar reflectivity at -10 C,isothermal,REFD,1 \ No newline at end of file diff --git a/docs/UsersGuide/source/SRW_PRSLEV_table.rst b/docs/UsersGuide/source/SRW_PRSLEV_table.rst new file mode 100644 index 0000000000..30048e2715 --- /dev/null +++ b/docs/UsersGuide/source/SRW_PRSLEV_table.rst @@ -0,0 +1,11 @@ +********************************************************** +Fields Requested in the UPP Parameter Table for SRW PRSLEV +********************************************************** + +Field description (column 1), level type as defined by WMO (column 2), abbreviated names +as they appear in the Grib2 output file (column 3), and number of levels output (column 4). + +.. csv-table:: + :file: SRW_PRSLEV_table.csv + :widths: 9, 40, 30, 15, 10 + :header-rows: 1 diff --git a/docs/UsersGuide/source/WE2Etests.rst b/docs/UsersGuide/source/WE2Etests.rst index 9bf062bf10..cb4b0a8673 100644 --- a/docs/UsersGuide/source/WE2Etests.rst +++ b/docs/UsersGuide/source/WE2Etests.rst @@ -1,212 +1,179 @@ .. _WE2E_tests: -================================ +================================== Workflow End-to-End (WE2E) Tests -================================ -The SRW App contains a set of end-to-end tests that exercise the App in various configurations. -These are referred to as workflow end-to-end (WE2E) tests because they all use the Rocoto -workflow manager to run their individual workflows. 
-The purpose of these tests is to -ensure that new changes to the App do not break existing functionality and capabilities. - - -Note that the WE2E tests are not regression tests, i.e. they do not check whether -current results are identical to previously established baselines. They are also -not tests of the scientific integrity of the results, e.g. they do not check that values -of output fields are reasonable. -These tests only check that the tasks within each test's workflow complete -successfully. -They are in essence tests of the workflow generation, task execution (j-jobs, +================================== +The SRW App contains a set of end-to-end tests that exercise various workflow configurations of the SRW App. These are referred to as workflow end-to-end (WE2E) tests because they all use the Rocoto +workflow manager to run their individual workflows. The purpose of these tests is to +ensure that new changes to the App do not break existing functionality and capabilities. + +Note that the WE2E tests are not regression tests---they do not check whether +current results are identical to previously established baselines. They also do +not test the scientific integrity of the results (e.g., they do not check that values +of output fields are reasonable). These tests only check that the tasks within each test's workflow complete successfully. They are, in essence, tests of the workflow generation, task execution (j-jobs, ex-scripts), and other auxiliary scripts (which are mostly in the ``regional_workflow`` -repo) to ensure that these scripts are correctly performing their functions. These functions +repository) to ensure that these scripts function correctly. These functions include creating and correctly arranging and naming directories and files, ensuring -that all input files are available and readable, calling -executables with correct namelists and/or options, etc. -For the time being, it is up to the external repositories that the App clones (:numref:`Section %s `) -to check that changes to those repos do not change results, or, if they do, to ensure that the new -results are acceptable. -(At least two of these external repositories --- ``UFS_UTILS`` and ``ufs-weather-model`` --- -do have such regression tests.) - -The script to run all or a subset of the WE2E tests is named ``run_WE2E_tests.sh`` and is located in the directory -``ufs-srweather-app/regional_workflow/tests/WE2E``. -Each WE2E test has associated with it a configuration file named ``config.${test_name}.sh``, -where ``${test_name}`` is the name of the corresponding test. -These configuration files are subsets of -the full ``config.sh`` experiment configuration file used in :numref:`Section %s ` -and described in :numref:`Section %s `. -For each test that the user wants -to run, the ``run_WE2E_tests.sh`` script reads in its configuration file and generates from -it a complete ``config.sh`` file. It then calls ``generate_FV3LAM_wflow.sh``, which in turn -reads in ``config.sh`` and generates a new experiment for the test. -The name of each experiment directory is set to that of the corresponding test, -and a copy of ``config.sh`` for each test is placed in its experiment directory. +that all input files are available and readable, calling executables with correct namelists and/or options, etc. Currently, it is up to the external repositories that the App clones (:numref:`Section %s `) to check that changes to those repos do not change results, or, if they do, to ensure that the new results are acceptable. 
(At least two of these external repositories---``UFS_UTILS`` and ``ufs-weather-model``---do have such regression tests.) For convenience, the WE2E tests are currently grouped into the following categories: * ``grids_extrn_mdls_suites_community`` - - This category of tests ensures that the SRW App workflow running in **community mode** - (i.e. with ``RUN_ENVIR`` set to ``"community"``) completes successfully for various - combinations of predefined grids, physics suites, and external models for ICs and LBCs. - Note that in community mode, all output from the App is placed under a single experiment - directory. + This category of tests ensures that the SRW App workflow running in **community mode** (i.e. with ``RUN_ENVIR`` set to ``"community"``) completes successfully for various combinations of predefined grids, physics suites, and external models for ICs and LBCs. Note that in community mode, all output from the App is placed under a single experiment directory. * ``grids_extrn_mdls_suites_nco`` - - This category of tests ensures that the workflow running in **NCO mode** - (i.e. with ``RUN_ENVIR`` set to ``"nco"``) completes successfully for various combinations - of predefined grids, physics suites, and external models for ICs and LBCs. Note that - in NCO mode, an operational run environment (i.e. variable names) is used along with - a directory structure in which output is placed in multiple directories (see - Sections :numref:`%s ` and :numref:`%s `). + This category of tests ensures that the workflow running in **NCO mode** (i.e. with ``RUN_ENVIR`` set to ``"nco"``) completes successfully for various combinations of predefined grids, physics suites, and external models for ICs and LBCs. Note that in NCO mode, an operational run environment (i.e. variable names) is used along with a directory structure in which output is placed in multiple directories (see + Sections :numref:`%s ` and :numref:`%s `). * ``wflow_features`` - This category of tests ensures that the workflow with various features/capabilities activated - completes successfully. To reduce computational cost, most tests in this category - use coarser grids. + This category of tests ensures that the workflow with various features/capabilities activated + completes successfully. To reduce computational cost, most tests in this category use coarser grids. -The test configuration files for these categories are located in the following directories, -respectively: +The test configuration files for these categories are located in the following directories, respectively: .. code-block:: - ufs-srweather-app/regional_workflow/tests/WE2E/test_configs/grids_extrn_mdls_suites_community - ufs-srweather-app/regional_workflow/tests/WE2E/test_configs/grids_extrn_mdls_suites_nco - ufs-srweather-app/regional_workflow/tests/WE2E/test_configs/wflow_features + ufs-srweather-app/regional_workflow/tests/WE2E/test_configs/grids_extrn_mdls_suites_community + ufs-srweather-app/regional_workflow/tests/WE2E/test_configs/grids_extrn_mdls_suites_nco + ufs-srweather-app/regional_workflow/tests/WE2E/test_configs/wflow_features + +The script to run the WE2E tests is named ``run_WE2E_tests.sh`` and is located in the directory ``ufs-srweather-app/regional_workflow/tests/WE2E``. Each WE2E test has an associated configuration file named ``config.${test_name}.sh``, where ``${test_name}`` is the name of the corresponding test. 
These configuration files are subsets of the full ``config.sh`` experiment configuration file used in :numref:`Section %s ` and described in :numref:`Section %s `. For each test, the ``run_WE2E_tests.sh`` script reads in its configuration file and generates from it a complete ``config.sh`` file. It then calls ``generate_FV3LAM_wflow.sh``, which in turn reads in ``config.sh`` and generates a new experiment for the test. The name of each experiment directory is set to that of the corresponding test, and a copy of ``config.sh`` for each test is placed in its experiment directory.

Since ``run_WE2E_tests.sh`` calls ``generate_FV3LAM_wflow.sh`` for each test, the Python modules required for experiment generation must be loaded before ``run_WE2E_tests.sh``
-can be called. See :numref:`Section %s ` for information on loading the Python
-environment on supported platforms. Note also that ``run_WE2E_tests.sh`` assumes that all of
-the executables have been built. If they have not, then ``run_WE2E_tests.sh`` will still
+can be called. See :numref:`Section %s ` for information on loading the Python
+environment on supported platforms. Note also that ``run_WE2E_tests.sh`` assumes that all of
+the executables have been built. If they have not, then ``run_WE2E_tests.sh`` will still
generate the experiment directories, but the workflows will fail.

+Supported Tests
+===================
+
+The full list of WE2E tests is extensive; it is not recommended to run all of the tests, as some are computationally expensive. A subset of the tests is supported for the latest release of the SRW Application. Frequent test cases appear in :numref:`Table %s ` below, and complete test cases appear :doc:`here `. Frequent tests are a lightweight set of tests that can be automated and run regularly on each Level 1 platform. These tests cover a wide scope of capabilities to ensure that there are no major, obvious faults in the underlying code. Complete tests include the remainder of the supported WE2E tests and cover a fuller list of workflow configurations.
+
+..
+   COMMENT: Add file w/supported tests to repo
+   COMMENT: Contrib Guide says that "Fundamental testing consists of a lightweight set of tests that can be automated and run regularly on each Level 1 platform. These are mostly low-resolution tests and cover a wide scope of capabilities to ensure that there are no major, obvious faults in the underlying code. Comprehensive testing includes the entire set of WE2E tests." How would we define frequent v complete?
+
+.. _FrequentTests:
+
+..
table:: Frequent Test Cases (Supported) + + +------------------+--------+--------+---------------+--------------+---------------------------------------------------------------------+ + | Grid | ICs | LBCs | Physics Suite | Dataset Used | Script Name | + +==================+========+========+===============+==============+=====================================================================+ + | CONUS_25km | FV3GFS | FV3GFS | GFS_v16 | 2019-07-01 | config.grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh | + +------------------+--------+--------+---------------+--------------+---------------------------------------------------------------------+ + | CONUS_13km | FV3GFS | FV3GFS | GFS_v16 | 2019-07-01 | config.grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh | + +------------------+--------+--------+---------------+--------------+---------------------------------------------------------------------+ + | INDIANAPOLIS_3km | FV3GFS | FV3GFS | GFS_v16 | 2019-06-15 | config.SUBCONUS_Ind_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh | + +------------------+--------+--------+---------------+--------------+---------------------------------------------------------------------+ + | INDIANAPOLIS_3km | HRRR | RAP | RRFS_v1beta | 2019-06-15 | config.SUBCONUS_Ind_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh | + +------------------+--------+--------+---------------+--------------+---------------------------------------------------------------------+ + | INDIANAPOLIS_3km | HRRR | RAP | HRRR | 2019-06-15 | config.SUBCONUS_Ind_3km_ics_HRRR_lbcs_RAP_suite_HRRR.sh | + +------------------+--------+--------+---------------+--------------+---------------------------------------------------------------------+ + | INDIANAPOLIS_3km | HRRR | RAP | WoFS | 2019-06-15 | config.SUBCONUS_Ind_3km_ics_HRRR_lbcs_RAP_suite_WoFS.sh | + +------------------+--------+--------+---------------+--------------+---------------------------------------------------------------------+ + Running the WE2E Tests ================================ -The user specifies the set of tests that ``run_WE2E_tests.sh`` will run by creating a text -file, say ``my_tests.txt``, that contains a list of the WE2E tests to run (one per line) -and passing the name of that file to ``run_WE2E_tests.sh``. For example, if the user -wants to run the tests ``new_ESGgrid`` and ``grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16`` -(from the ``wflow_features`` and ``grids_extrn_mdls_suites_community`` categories, respectively), we would have: +Users may specify the set of tests to run by creating a text file, such as ``my_tests.txt``, which contains a list of the WE2E tests to run (one per line). Then, they pass the name of that file to ``run_WE2E_tests.sh``. For example, to run the tests ``new_ESGgrid`` and ``grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16`` (from the ``wflow_features`` and ``grids_extrn_mdls_suites_community`` categories, respectively), users would enter the following: .. code-block:: console - > cat my_tests.txt - new_ESGgrid - grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16 + > cat my_tests.txt + new_ESGgrid + grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16 -For each test in ``my_tests.txt``, ``run_WE2E_tests.sh`` -will generate a new experiment directory and, by default, create a new cron job in the user's cron -table that will (re)launch the workflow every 2 minutes. 
This cron job calls the workflow launch script
-``launch_FV3LAM_wflow.sh`` located in the experiment directory until the workflow either
-completes successfully (i.e. all tasks are successful) or fails (i.e. at least one task fails).
+For each test in ``my_tests.txt``, ``run_WE2E_tests.sh`` will generate a new experiment directory and, by default, create a new cron job in the user's cron table that will (re)launch the workflow every 2 minutes. This cron job calls the workflow launch script ``launch_FV3LAM_wflow.sh`` until the workflow either completes successfully (i.e., all tasks are successful) or fails (i.e., at least one task fails).
The cron job is then removed from the user's cron table.

-Next, we show several common ways that ``run_WE2E_tests.sh`` can be called with
-the ``my_tests.txt`` file above.
+The examples below demonstrate several common ways that ``run_WE2E_tests.sh`` can be called with the ``my_tests.txt`` file above.

-1) To run the tests listed in ``my_tests.txt`` on Hera and charge the computational
+#. To run the tests listed in ``my_tests.txt`` on Hera and charge the computational
   resources used to the "rtrr" account, use:

   .. code-block::

-      > run_WE2E_tests.sh tests_file="my_tests.txt" machine="hera" account="rtrr"
-
-   This will create the experiment subdirectories for the two tests in
-   the directory
-
-   .. code-block::
-
-      ${SR_WX_APP_TOP_DIR}/../expt_dirs
+      > run_WE2E_tests.sh tests_file="my_tests.txt" machine="hera" account="rtrr"

-   where ``SR_WX_APP_TOP_DIR`` is the directory in which the ufs-srweather-app
-   repository is cloned (usually set to something like ``/path/to/ufs-srweather-app``).
+   This will create the experiment subdirectories for the two sample WE2E tests in the directory ``${SR_WX_APP_TOP_DIR}/../expt_dirs``, where ``SR_WX_APP_TOP_DIR`` is the directory in which the ufs-srweather-app repository is cloned (usually set to something like ``/path/to/ufs-srweather-app``). Thus, the following two experiment directories will be created:

   .. code-block::

-      ${SR_WX_APP_TOP_DIR}/../expt_dirs/new_ESGgrid
-      ${SR_WX_APP_TOP_DIR}/../expt_dirs/grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16
+      ${SR_WX_APP_TOP_DIR}/../expt_dirs/new_ESGgrid
+      ${SR_WX_APP_TOP_DIR}/../expt_dirs/grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16

-   In addition, by default, cron jobs will be added to the user's cron
-   table to relaunch the workflows of these experiments every 2 minutes.
+   In addition, by default, cron jobs will be added to the user's cron table to relaunch the workflows of these experiments every 2 minutes.

-2) To change the frequency with which the cron relaunch jobs are submitted
+#. To change the frequency with which the cron relaunch jobs are submitted
   from the default of 2 minutes to 1 minute, use:

   .. code-block::

-      > run_WE2E_tests.sh tests_file="my_tests.txt" machine="hera" account="rtrr" cron_relaunch_intvl_mnts="01"
+      > run_WE2E_tests.sh tests_file="my_tests.txt" machine="hera" account="rtrr" cron_relaunch_intvl_mnts="01"

-3) To disable use of cron (which implies the worfkow for each test will
-   have to be relaunched manually from within each experiment directory),
-   use:
+#. To disable use of cron (which implies the workflow for each test will have to be relaunched manually from within each experiment directory), use:

..
code-block:: - > run_WE2E_tests.sh tests_file="my_tests.txt" machine="hera" account="rtrr" use_cron_to_relaunch="FALSE" + > run_WE2E_tests.sh tests_file="my_tests.txt" machine="hera" account="rtrr" use_cron_to_relaunch="FALSE" - In this case, the user will have to go into each test's experiment directory and - either manually call the ``launch_FV3LAM_wflow.sh`` script or use the Rocoto commands described - in :numref:`Chapter %s ` to (re)launch the workflow. Note that if using the Rocoto - commands directly, the log file ``log.launch_FV3LAM_wflow`` will not be created; in this case, - the status of the workflow can be checked using the ``rocotostat`` command (see :numref:`Chapter %s `). + In this case, the user will have to go into each test's experiment directory and either manually call the ``launch_FV3LAM_wflow.sh`` script or use the Rocoto commands described in :numref:`Chapter %s ` to (re)launch the workflow. Note that if using the Rocoto commands directly, the log file ``log.launch_FV3LAM_wflow`` will not be created; in this case, the status of the workflow can be checked using the ``rocotostat`` command (see :numref:`Chapter %s `). -4) To place the experiment subdirectories in a subdirectory named ``test_set_01`` under +#. To place the experiment subdirectories in a subdirectory named ``test_set_01`` under ``${SR_WX_APP_TOP_DIR}/../expt_dirs`` (instead of immediately under the latter), use: .. code-block:: - > run_WE2E_tests.sh tests_file="my_tests.txt" machine="hera" account="rtrr" expt_basedir="test_set_01" + > run_WE2E_tests.sh tests_file="my_tests.txt" machine="hera" account="rtrr" expt_basedir="test_set_01" In this case, the full paths to the experiment directories will be: .. code-block:: - ${SR_WX_APP_TOP_DIR}/../expt_dirs/test_set_01/new_ESGgrid - ${SR_WX_APP_TOP_DIR}/../expt_dirs/test_set_01/grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16 + ${SR_WX_APP_TOP_DIR}/../expt_dirs/test_set_01/new_ESGgrid + ${SR_WX_APP_TOP_DIR}/../expt_dirs/test_set_01/grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16 This is useful for grouping various sets of tests. -5) To use a test list file (again named ``my_tests.txt``) located in ``/path/to/custom/location`` - instead of in the same directory as ``run_WE2E_tests.sh``, and to have the experiment directories - be placed in an arbitrary location, say ``/path/to/custom/expt_dirs``, use: +#. To use a test list file (again named ``my_tests.txt``) located in ``/path/to/custom/location`` instead of in the same directory as ``run_WE2E_tests.sh``, and to have the experiment directories be placed in an arbitrary location, say ``/path/to/custom/expt_dirs``, use: .. code-block:: - > run_WE2E_tests.sh tests_file="/path/to/custom/location/my_tests.txt" machine="hera" account="rtrr" expt_basedir="/path/to/custom/expt_dirs" + > run_WE2E_tests.sh tests_file="/path/to/custom/location/my_tests.txt" machine="hera" account="rtrr" expt_basedir="/path/to/custom/expt_dirs" The full usage statement for ``run_WE2E_tests.sh`` is as follows: .. code-block:: - run_WE2E_tests.sh \ - tests_file="..." \ - machine="..." \ - account="..." \ - [expt_basedir="..."] \ - [exec_subdir="..."] \ - [use_cron_to_relaunch="..."] \ - [cron_relaunch_intvl_mnts="..."] \ - [verbose="..."] \ - [machine_file="..."] \ - [stmp="..."] \ - [ptmp="..."] \ - [compiler="..."] \ - [build_env_fn="..."] - -The arguments in brackets are optional. A complete description of these arguments can be + run_WE2E_tests.sh \ + tests_file="..." \ + machine="..." \ + account="..." 
\ + [expt_basedir="..."] \ + [exec_subdir="..."] \ + [use_cron_to_relaunch="..."] \ + [cron_relaunch_intvl_mnts="..."] \ + [verbose="..."] \ + [machine_file="..."] \ + [stmp="..."] \ + [ptmp="..."] \ + [compiler="..."] \ + [build_env_fn="..."] + +The arguments in brackets are optional. A complete description of these arguments can be obtained by issuing .. code-block:: - run_WE2E_tests.sh --help + run_WE2E_tests.sh --help in the directory ``ufs-srweather-app/regional_workflow/tests/WE2E``. @@ -219,11 +186,10 @@ In addition to creating the WE2E tests' experiment directories and optionally cr cron jobs to launch their workflows, the ``run_WE2E_tests.sh`` script generates (if necessary) a CSV (Comma-Separated Value) file named ``WE2E_test_info.csv`` that contains information on the full set of WE2E tests. This file serves as a single location where relevant -information about the WE2E tests can be found. It can be imported into Google Sheets -using the "|" (pipe symbol) character as the custom field separator. The rows of the +information about the WE2E tests can be found. It can be imported into Google Sheets +using the "|" (pipe symbol) character as the custom field separator. The rows of the file/sheet represent the full set of available tests (not just the ones to be run), -while the columns contain the following information (column titles are included in the -CSV file): +while the columns contain the following information (column titles are included in the CSV file): | Column 1: | The primary test name and (in parentheses) the category subdirectory it is @@ -238,14 +204,14 @@ CSV file): | Column 4: | The number of times the forecast model will be run by the test. This gives an idea - of how expensive the test is. It is calculated using quantities such as the number + of how expensive the test is. It is calculated using quantities such as the number of cycle dates (i.e. forecast model start dates) and the number of of ensemble members - (if running ensemble forecasts). The are in turn obtained directly or indirectly + (if running ensemble forecasts). The are in turn obtained directly or indirectly from the quantities in Columns 5, 6, .... | Columns 5,6,...: | The values of various experiment variables (if defined) in each test's configuration - file. Currently, the following experiment variables are included: + file. Currently, the following experiment variables are included: | ``PREDEF_GRID_NAME`` | ``CCPP_PHYS_SUITE`` @@ -262,38 +228,30 @@ CSV file): Additional fields (columns) will likely be added to the CSV file in the near future. Note that the CSV file is not part of the ``regional_workflow`` repo (i.e. it is -not tracked by the repo). The ``run_WE2E_tests.sh`` script will generate a CSV +not tracked by the repo). The ``run_WE2E_tests.sh`` script will generate a CSV file if (1) the CSV file doesn't already exist, or (2) the CSV file does exist -but changes have been made to one or more of the category subdirectories (e.g. +but changes have been made to one or more of the category subdirectories (e.g., test configuration files modified, added, or deleted) since the creation of the -CSV file. Thus, ``run_WE2E_tests.sh`` will always create a CSV file the first +CSV file. Thus, ``run_WE2E_tests.sh`` will always create a CSV file the first time it is run in a fresh git clone of the SRW App. 
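Because the file uses the "|" character as its field separator, it can also be skimmed directly from the command line rather than being imported into a spreadsheet. The following is a minimal sketch; it assumes the standard ``column`` and ``head`` utilities are available and that the command is run from the directory containing ``WE2E_test_info.csv``:

.. code-block:: console

   # Align the pipe-separated columns and display the first few tests
   column -t -s '|' WE2E_test_info.csv | head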
-
 Checking Test Status
-====================
-If cron jobs are being used to periodically relaunch the tests, the status of
-each test can be checked by viewing the end of the log file ``log.launch_FV3LAM_wflow``
-(since the cron jobs use ``launch_FV3LAM_wflow.sh`` to relaunch the workflow, and
-that in turn generates the log files). Otherwise (or alternatively), the ``rocotorun``/``rocotostat``
-combination of commands can be used. See :numref:`Section %s ` for
-details.
-
-The App also provides the script ``get_expts_status.sh`` in the directory
-``ufs-srweather-app/regional_workflow/tests/WE2E`` that can be used to generate
-a status summary for all tests in a given base directory. This script updates
+======================
+If cron jobs are used to periodically relaunch the tests, the status of each test can be checked by viewing the end of the log file ``log.launch_FV3LAM_wflow`` (since the cron jobs use ``launch_FV3LAM_wflow.sh`` to relaunch the workflow, and that in turn generates the log files). Otherwise (or alternatively), the ``rocotorun``/``rocotostat`` combination of commands can be used. See :numref:`Section %s ` for details.
+
+The SRW App also provides the script ``get_expts_status.sh`` in the directory
+``ufs-srweather-app/regional_workflow/tests/WE2E``, which can be used to generate
+a status summary for all tests in a given base directory. This script updates
 the workflow status of each test (by internally calling ``launch_FV3LAM_wflow.sh``)
-and then prints out to screen the status of the various tests. It also creates
+and then prints the status of the various tests to the command prompt. It also creates
 a status report file named ``expts_status_${create_date}.txt`` (where ``create_date``
 is a time stamp of the form ``YYYYMMDDHHmm`` corresponding to the creation date/time
-of the report) and places it in the experiment base directory. This status file
+of the report) and places it in the experiment base directory. This status file
 contains the last 40 lines (by default; this can be adjusted via the ``num_log_lines``
-argument) from the end of each
-``log.launch_FV3LAM_wflow`` log file. These lines include the experiment status
-as well as the task status table generated by ``rocotostat`` (so that, in
+argument) from the end of each ``log.launch_FV3LAM_wflow`` log file. These lines include the experiment status as well as the task status table generated by ``rocotostat`` (so that, in
 case of failure, it is convenient to pinpoint the task that failed).

-For details on the usage of ``get_expts_stats.sh``, issue
+For details on the usage of ``get_expts_status.sh``, issue the command:

 .. code-block::

@@ -326,17 +284,17 @@ Here is an example of how to call ``get_expts_status.sh`` along with sample outp
   ======================================
   A status report has been created in:
-    expts_status_fp = "/path/to/expt_dirs/set01/expts_status_202204211440.txt"
+    expts_status_fp = "/path/to/expt_dirs/set01/expts_status_202204211440.txt"

   DONE.

-The "Workflow status" field of each test indicates the status of its workflow.
+The "Workflow status" field of each test indicates the status of its workflow.
 The values that this can take on are "SUCCESS", "FAILURE", and "IN PROGRESS".


 Modifying the WE2E System
-=========================
+============================
 This section describes various ways in which the WE2E testing system can be modified
 to suit specific testing needs.

@@ -344,10 +302,10 @@ to suit specific testing needs.

 ..
_ModExistingTest: Modifying an Existing Test ---------------------- +----------------------------- To modify an existing test, simply edit the configuration file for that test by changing existing variable values and/or adding new variables to suit the requirements of the -modified test. Such a change may also require modifications to the test description +modified test. Such a change may also require modifications to the test description in the header of the file. @@ -356,31 +314,24 @@ in the header of the file. Adding a New Test --------------------- To add a new test named, for example ``new_test01``, to one of the existing categories listed -above, say ``wflow_features``: +above, such as ``wflow_features``: -1) Choose an existing test configuration file in any one of the category directories that - matches most closely the new test to be added. Copy that file to ``config.new_test01.sh`` - and, if necessary, move it to the ``wflow_features`` category directory. +#. Choose an existing test configuration file in any one of the category directories that matches most closely the new test to be added. Copy that file to ``config.new_test01.sh`` and, if necessary, move it to the ``wflow_features`` category directory. -2) Edit ``config.new_test01.sh`` so that the header containing the test description properly - describes the new test. +#. Edit ``config.new_test01.sh`` so that the header containing the test description properly describes the new test. -3) Further edit ``config.new_test01.sh`` by modifying existing experiment variable values - and/or adding new variables such that the test runs with the intended configuration. +#. Further edit ``config.new_test01.sh`` by modifying existing experiment variable values and/or adding new variables such that the test runs with the intended configuration. .. _AddNewCategory: Adding a New WE2E Test Category ------------------------------ -To create a new test category called, e.g. ``new_category``: +----------------------------------- +To create a new test category called, e.g., ``new_category``: -1) In the directory ``ufs-srweather-app/regional_workflow/tests/WE2E/test_configs``, - create a new directory named ``new_category``. +#. In the directory ``ufs-srweather-app/regional_workflow/tests/WE2E/test_configs``, create a new directory named ``new_category``. -2) In the file ``get_WE2Etest_names_subdirs_descs.sh``, add the element ``"new_category"`` - to the array ``category_subdirs`` that contains the list of categories/subdirectories - in which to search for test configuration files. Thus, ``category_subdirs`` becomes: +#. In the file ``get_WE2Etest_names_subdirs_descs.sh``, add the element ``"new_category"`` to the array ``category_subdirs`` that contains the list of categories/subdirectories in which to search for test configuration files. Thus, ``category_subdirs`` becomes: .. code-block:: console @@ -392,56 +343,47 @@ To create a new test category called, e.g. ``new_category``: "new_category" \ ) -New tests can now be added to ``new_category`` using the procedure described in -:numref:`Section %s `. +New tests can now be added to ``new_category`` using the procedure described in :numref:`Section %s `. .. _CreateAltTestNames: Creating Alternate Names for a Test ------------------------------------ -In order to prevent proliferation of WE2E tests, users might want to use the same -test for multiple purposes. 
For example, consider the test
-
-   ``grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16``
-
-in the ``grids_extrn_mdls_suites_community`` category. This checks for the successful
-completion of the Rocoto workflow running the combination
-of the ``RRFS_CONUScompact_25km`` grid, the ``FV3GFS`` model for
-ICs and LBCs, and the ``FV3_GFS_v16`` physics suite. If this test also
-happens to use the inline post capability of the weather model (it currently
-doesn't; this is only a hypothetical example), then this test can also be used
-to ensure that the inline post feature of the App/weather model (which is
-activated in the App by setting ``WRITE_DOPOST`` to ``"TRUE"``) is working properly.
-Since this test will serve two purposes, it should have two names --- one per purpose.
-Assume we want to set the second (alternate) name to ``activate_inline_post``. This
-can be accomplished by creating
-a symlink named ``config.activate_inline_post.sh``, most appropriately in the ``wflow_features``
-category directory, that points to the configuration file
+--------------------------------------
+To prevent proliferation of WE2E tests, users might want to use the same test for multiple purposes. For example, consider the test
+
+   ``grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16``
+
+in the ``grids_extrn_mdls_suites_community`` category. This checks for the successful
+completion of the Rocoto workflow running the combination of the ``RRFS_CONUScompact_25km`` grid, the ``FV3GFS`` model for ICs and LBCs, and the ``FV3_GFS_v16`` physics suite. If this test also
+happens to use the inline post capability of the Weather Model (it currently doesn't; this is only a hypothetical example), then this test can also be used to ensure that the inline post feature of the App/Weather Model (which is activated in the App by setting ``WRITE_DOPOST`` to ``"TRUE"``) is working properly.
+Since this test will serve two purposes, it should have two names --- one per purpose. To set the second (alternate) name to ``activate_inline_post``, the user needs to create a symlink named ``config.activate_inline_post.sh`` in the ``wflow_features`` category directory that points to the configuration file
 ``config.grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh``
-in the ``grids_extrn_mdls_suites_community`` category directory.
+in the ``grids_extrn_mdls_suites_community`` category directory, for example (working from within
+the ``test_configs`` directory, which contains the category subdirectories):
+
+.. code-block:: console
+
+   ln -fs --relative grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh \
+                     wflow_features/config.activate_inline_post.sh
+
 In this situation, the primary name for the test is
 ``grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16``
-(because ``config.grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh``
-is an actual file, not a symlink), and ``activate_inline_post`` is an alternate name.
-This approach of allowing multiple names for the same test makes it easier to identify
-the multiple purposes that a test may serve.
+(because ``config.grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh`` is an actual file, not a symlink), and ``activate_inline_post`` is an alternate name. This approach of allowing multiple names for the same test makes it easier to identify the multiple purposes that a test may serve.

 Note the following:

-* A primary test can have more than one alternate test name (by having more than
-  one symlink point to the test's configuration file).
-* The symlinks representing the alternate test names can be in the same or a
-  different category directory.
+* A primary test can have more than one alternate test name (by having more than one symlink point to the test's configuration file).
+* The symlinks representing the alternate test names can be in the same or a different category directory.
+* The ``--relative`` flag makes the symlink relative (i.e., within/below the ``regional_workflow`` directory structure) so that it stays valid when copied to other locations. However, the ``--relative`` flag may behave differently, or may not be available, on every platform.
 * To determine whether a test has one or more alternate names, a user can
-  view the CSV file ``WE2E_test_info.csv`` that ``run_WE2E_tests.sh`` generates.
+  view the CSV file ``WE2E_test_info.csv`` that ``run_WE2E_tests.sh`` generates.
   Recall from :numref:`Section %s ` that column 1 of this CSV file contains the test's
   primary name (and its category) while column 2 contains any alternate names (and
   their categories).
 * With this primary/alternate test naming convention, a user can list either the
   primary test name or one of the alternate test names in the experiments list file
-  (e.g. ``my_tests.txt``) that ``run_WE2E_tests.sh`` reads in. If both primary and
+  (e.g., ``my_tests.txt``) that ``run_WE2E_tests.sh`` reads in. If both primary and
   one or more alternate test names are listed, then ``run_WE2E_tests.sh`` will exit
   with a warning message without running any tests.
+
diff --git a/docs/UsersGuide/source/_static/FV3LAM_wflow_flowchart_v2.png b/docs/UsersGuide/source/_static/FV3LAM_wflow_flowchart_v2.png
new file mode 100644
index 0000000000..545a8db938
Binary files /dev/null and b/docs/UsersGuide/source/_static/FV3LAM_wflow_flowchart_v2.png differ
diff --git a/docs/UsersGuide/source/_static/SUBCONUS_Ind_3km.png b/docs/UsersGuide/source/_static/SUBCONUS_Ind_3km.png
new file mode 100644
index 0000000000..8bbcbc7aa3
Binary files /dev/null and b/docs/UsersGuide/source/_static/SUBCONUS_Ind_3km.png differ
diff --git a/docs/UsersGuide/source/_static/custom.css b/docs/UsersGuide/source/_static/custom.css
index 25f207a223..c02df7fed2 100644
--- a/docs/UsersGuide/source/_static/custom.css
+++ b/docs/UsersGuide/source/_static/custom.css
@@ -4,3 +4,22 @@
     font-weight: bold;
     font-style: italic;
 }
+
+.underline {
+    text-decoration: underline;
+}
+
+.bolditalic {
+    font-weight: bold;
+    font-style: italic;
+}
+
+table.align-default {
+    margin-left: 0px;
+    margin-right: auto;
+}
+
+table.align-center {
+    margin-left: 0px;
+    margin-right: auto;
+}
diff --git a/docs/UsersGuide/source/_static/theme_overrides.css b/docs/UsersGuide/source/_static/theme_overrides.css
index 9143850a43..c38958b095 100644
--- a/docs/UsersGuide/source/_static/theme_overrides.css
+++ b/docs/UsersGuide/source/_static/theme_overrides.css
@@ -7,9 +7,11 @@
        white-space: normal !important;
     }

-    .wy-table-responsive {
-       overflow: visible !important;
-    }
+    /* .wy-table-responsive { */
+    /*    overflow: visible !important; */
+    /* } */
 }
+
+
diff --git a/docs/UsersGuide/source/index.rst b/docs/UsersGuide/source/index.rst
index 93f240d672..48ba5c40ca 100644
--- a/docs/UsersGuide/source/index.rst
+++ b/docs/UsersGuide/source/index.rst
@@ -13,16 +13,16 @@ UFS Short-Range Weather App Users Guide
    Introduction
    Quickstart
+   Non-ContainerQS
    BuildRunSRW
    Components
    Include-HPCInstall
    InputOutputFiles
    ConfigWorkflow
    LAMGrids
-   ConfigNewPlatform
+   RocotoInfo
    WE2Etests
    Graphics
    ContributorsGuide
    FAQ
-   RocotoInfo
    Glossary
diff --git a/hpc-stack-mod b/hpc-stack-mod
index 0199b163a2..dd5f1f53e5 160000
--- a/hpc-stack-mod
+++ b/hpc-stack-mod
@@ -1 +1 @@
-Subproject
commit 0199b163a28d410524ebd9586699ca20620aa509 +Subproject commit dd5f1f53e5babe7b786c72a75e8898fb0debe78c