[develop] Add process lightning task #644

Merged
96 changes: 96 additions & 0 deletions jobs/JREGIONAL_PROCESS_LIGHTNING
@@ -0,0 +1,96 @@
#!/bin/bash

#
#-----------------------------------------------------------------------
#
# This J-job script runs the NetCDF lightning observation preprocessing
# for the FV3-LAM model.
#
#-----------------------------------------------------------------------
#
#
#-----------------------------------------------------------------------
#
# Source the variable definitions file and the bash utility functions.
#
#-----------------------------------------------------------------------
#
. ${USHdir}/source_util_funcs.sh
source_config_for_task "task_process_lightning" ${GLOBAL_VAR_DEFNS_FP}
. ${USHdir}/job_preamble.sh "TRUE"
#
#-----------------------------------------------------------------------
#
# Save current shell options (in a global array). Then set new options
# for this script/function.
#
#-----------------------------------------------------------------------
#
{ save_shell_opts; . $USHdir/preamble.sh; } > /dev/null 2>&1
#
#-----------------------------------------------------------------------
#
# Get the full path to the file in which this script/function is located
# (scrfunc_fp), the name of that file (scrfunc_fn), and the directory in
# which the file is located (scrfunc_dir).
#
#-----------------------------------------------------------------------
#
scrfunc_fp=$( $READLINK -f "${BASH_SOURCE[0]}" )
scrfunc_fn=$( basename "${scrfunc_fp}" )
scrfunc_dir=$( dirname "${scrfunc_fp}" )
#
#-----------------------------------------------------------------------
#
# Print message indicating entry into script.
#
#-----------------------------------------------------------------------
#
print_info_msg "
========================================================================
Entering script: \"${scrfunc_fn}\"
In directory: \"${scrfunc_dir}\"
This is the J-job script for the task that runs lightning observation
preprocessing for the specified cycle.
========================================================================"
#
#-----------------------------------------------------------------------
#
# Create the working directory under the cycle directory.
#
#-----------------------------------------------------------------------
#
if [ "${CYCLE_TYPE}" == "spinup" ]; then
DATA="${DATA:-${COMIN}/process_lightning_spinup}"
else
DATA="${DATA:-${COMIN}/process_lightning}"
fi
mkdir_vrfy -p ${DATA}
#
#-----------------------------------------------------------------------
#
# Call the ex-script for this J-job and pass to it the necessary
# variables.
#
#-----------------------------------------------------------------------
#
$SCRIPTSdir/exregional_process_lightning.sh || print_err_msg_exit "\
Call to ex-script corresponding to J-job \"${scrfunc_fn}\" failed."
#
#-----------------------------------------------------------------------
#
# Run job postamble.
#
#-----------------------------------------------------------------------
#
job_postamble
#
#-----------------------------------------------------------------------
#
# Restore the shell options saved at the beginning of this
# script/function.
#
#-----------------------------------------------------------------------
#
{ restore_shell_opts; } > /dev/null 2>&1
193 changes: 193 additions & 0 deletions scripts/exregional_process_lightning.sh
@@ -0,0 +1,193 @@
#!/bin/bash

#
#-----------------------------------------------------------------------
#
# Source the variable definitions file and the bash utility functions.
#
#-----------------------------------------------------------------------
#
. $USHdir/source_util_funcs.sh
source_config_for_task "task_process_lightning" ${GLOBAL_VAR_DEFNS_FP}
#
#-----------------------------------------------------------------------
#
# Save current shell options (in a global array). Then set new options
# for this script/function.
#
#-----------------------------------------------------------------------
#
{ save_shell_opts; . $USHdir/preamble.sh; } > /dev/null 2>&1
#
#-----------------------------------------------------------------------
#
# Get the full path to the file in which this script/function is located
# (scrfunc_fp), the name of that file (scrfunc_fn), and the directory in
# which the file is located (scrfunc_dir).
#
#-----------------------------------------------------------------------
#
scrfunc_fp=$( readlink -f "${BASH_SOURCE[0]}" )
scrfunc_fn=$( basename "${scrfunc_fp}" )
scrfunc_dir=$( dirname "${scrfunc_fp}" )
#
#-----------------------------------------------------------------------
#
# Print message indicating entry into script.
#
#-----------------------------------------------------------------------
#
print_info_msg "
========================================================================
Entering script: \"${scrfunc_fn}\"
In directory: \"${scrfunc_dir}\"
This is the ex-script for the task that runs lightning preprocessing
with FV3 for the specified cycle.
========================================================================"
#
#-----------------------------------------------------------------------
#
# Extract from PDY and cyc the starting year, month, day, and hour of
# the forecast. These are needed below for various operations.
#
#-----------------------------------------------------------------------
#
START_DATE=$(echo "${PDY} ${cyc}")
YYYYMMDDHH=$(date +%Y%m%d%H -d "${START_DATE}")
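
# Quick sanity check (the values are hypothetical, not part of this PR): with
# PDY=20230412 and cyc=06, START_DATE expands to "20230412 06" and the
# command above,
#   date +%Y%m%d%H -d "20230412 06"
# resolves to 2023041206, the analysis time written to namelist.lightning below.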

#
#-----------------------------------------------------------------------
#
# Get into working directory
#
#-----------------------------------------------------------------------
#
print_info_msg "$VERBOSE" "
Getting into working directory for lightning process ..."

cd_vrfy ${DATA}

pregen_grid_dir=$DOMAIN_PREGEN_BASEDIR/${PREDEF_GRID_NAME}

print_info_msg "$VERBOSE" "pregen_grid_dir is $pregen_grid_dir"

#
#-----------------------------------------------------------------------
#
# Copy the pre-generated grid specification file into the working directory
#
#-----------------------------------------------------------------------

cp_vrfy ${pregen_grid_dir}/fv3_grid_spec fv3sar_grid_spec.nc

#-----------------------------------------------------------------------
#
# Link to the NLDN data
#
#-----------------------------------------------------------------------
run_lightning=false
filenum=0

for incr in $(seq -25 5 5) ; do
  filedate=$(date +"%y%j%H%M" -d "${START_DATE} ${incr} minutes ")
  filename=${LIGHTNING_ROOT}/${filedate}0005r
  if [ -r ${filename} ]; then
    (( filenum += 1 ))
    ln -sf ${filename} ./NLDN_lightning_${filenum}
    run_lightning=true
  else
    echo "${filename} does not exist"
  fi
done
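
# Worked example (hypothetical analysis time, not fixed by this PR): for
# PDY=20230412 and cyc=06 the loop scans offsets of -25 to +5 minutes in
# 5-minute steps. The incr=-25 iteration targets 05:35 UTC on day-of-year 102,
# so the %y%j%H%M format gives filedate=231020535 and the script looks for
#   ${LIGHTNING_ROOT}/2310205350005r
# linking it as ./NLDN_lightning_1 if it is readable.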

echo "found GLD360 files: ${filenum}"

#-----------------------------------------------------------------------
#
# Copy the BUFR table from the fix directory
#
#-----------------------------------------------------------------------
BUFR_TABLE=${FIXgsi}/prepobs_prep_RAP.bufrtable

cp_vrfy $BUFR_TABLE prepobs_prep.bufrtable

#-----------------------------------------------------------------------
#
# Build the namelist and run the executable
#
# analysis_time : analysis date/time (YYYYMMDDHH) for which obs are processed
# NLDN_filenum  : number of NLDN lightning observation files found above
# grid_type     : name of the predefined background grid (PREDEF_GRID_NAME)
# obs_type      : lightning observation format; "nldn_nc" selects NLDN NetCDF
#-----------------------------------------------------------------------

cat << EOF > namelist.lightning
&setup
analysis_time = ${YYYYMMDDHH},
NLDN_filenum = ${filenum},
grid_type = "${PREDEF_GRID_NAME}",
obs_type = "nldn_nc"
/
EOF
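
# With the hypothetical values used in the examples above (YYYYMMDDHH=2023041206,
# all lightning files found so filenum=7) and an assumed PREDEF_GRID_NAME of
# RRFS_CONUS_3km, the generated namelist.lightning would read:
#
#   &setup
#    analysis_time = 2023041206,
#    NLDN_filenum = 7,
#    grid_type = "RRFS_CONUS_3km",
#    obs_type = "nldn_nc"
#   /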

#
#-----------------------------------------------------------------------
#
# Set the full path to the executable and check that it exists.
#
#-----------------------------------------------------------------------
#
exec_fn="process_Lightning.exe"
exec_fp="$EXECdir/${exec_fn}"

if [ ! -f "${exec_fp}" ]; then
  print_err_msg_exit "\
The executable specified in exec_fp does not exist:
  exec_fp = \"${exec_fp}\"
Build the lightning preprocessing executable and rerun."
fi
#
#
#-----------------------------------------------------------------------
#
# Run the process
#
#-----------------------------------------------------------------------
#

if [[ "$run_lightning" == true ]]; then
PREP_STEP
eval ${RUN_CMD_UTILS} ${exec_fp} ${REDIRECT_OUT_ERR} || \
print_err_msg_exit "\
Call to executable (exec_fp) to run lightning (nc) process returned
with nonzero exit code:
exec_fp = \"${exec_fp}\""
POST_STEP
fi
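
# Illustration only (platform-dependent assumptions, not defined by this PR):
# RUN_CMD_UTILS and REDIRECT_OUT_ERR come from the platform configuration, so
# on a hypothetical Slurm machine the eval above might effectively run
#   srun --export=ALL $EXECdir/process_Lightning.exe
# with stdout/stderr redirected according to the workflow's output conventions.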

#
#-----------------------------------------------------------------------
#
# Print message indicating successful completion of script.
#
#-----------------------------------------------------------------------
#
print_info_msg "
========================================================================
LIGHTNING PROCESS completed successfully!!!
Exiting script: \"${scrfunc_fn}\"
In directory: \"${scrfunc_dir}\"
========================================================================"
#
#-----------------------------------------------------------------------
#
# Restore the shell options saved at the beginning of this
# script/function.
#
#-----------------------------------------------------------------------
#
{ restore_shell_opts; } > /dev/null 2>&1
33 changes: 31 additions & 2 deletions ush/config_defaults.yaml
@@ -332,6 +332,7 @@ platform:
#
#-----------------------------------------------------------------------
#
MODEL: ""
MET_INSTALL_DIR: ""
MET_BIN_EXEC: ""
METPLUS_PATH: ""
@@ -397,6 +398,9 @@ platform:
# FIXsfc:
# System directory where surface climatology data is located
#
# FIXgsi:
# System directory where GSI fixed files are located
#
#-----------------------------------------------------------------------
#
FIXgsm: ""
@@ -405,6 +409,7 @@
FIXorg: ""
FIXsfc: ""
FIXshp: ""
FIXgsi: ""
#
#-----------------------------------------------------------------------
#
@@ -435,7 +440,17 @@ platform:
COMINgfs: ""
COMINgefs: ""
COMINairnow: "/path/to/real/time/airnow/data"

#
#-----------------------------------------------------------------------
#
# Set up default observation locations for data assimilation:
#
# LIGHTNING_ROOT: location of lightning observations
#
#-----------------------------------------------------------------------
#
LIGHTNING_ROOT: ""

#-----------------------------
# WORKFLOW config parameters
#-----------------------------
@@ -1031,6 +1046,21 @@ nco:
MAILTO: ""
MAILCC: ""

#----------------------------
# DO_ parameters. These act as workflow switches, since some of them
# are used in FV3LAM_wflow.xml.
#-----------------------------
rrfs:
#
#-----------------------------------------------------------------------
#
# DO_NLDN_LGHT
# Flag to turn on processing of NLDN NetCDF lightning data
#
#-----------------------------------------------------------------------
#
DO_NLDN_LGHT: false

EdwardSnyder-NOAA (Collaborator, Author) commented on Apr 12, 2023:

@christinaholtNOAA in #647 you mentioned that these DO_ switches, at this level and quantity, shouldn't be how the workflow is configured, and that we should define the tasks based on their definitions. If that's the case, do I need to include the DO_NLDN_LGHT variable?

A collaborator replied:

I'm fine with it staying. I think we can assess whether it's needed for the workflow when it's integrated. One of these here or there can be handled. It's hard to balance/manage a bunch of them, though.

#----------------------------
# MAKE GRID config parameters
#-----------------------------
@@ -2364,4 +2394,3 @@ rocoto:
log: ""
tasks:
taskgroups: ""

6 changes: 3 additions & 3 deletions ush/machine/hera.yaml
@@ -37,8 +37,8 @@ platform:
FIXsfc: /scratch1/NCEPDEV/nems/role.epic/UFS_SRW_data/develop/fix/fix_sfc_climo
FIXshp: /scratch1/NCEPDEV/nems/role.epic/UFS_SRW_data/develop/NaturalEarth
EXTRN_MDL_DATA_STORES: hpss aws nomads


rocoto:
tasks:
metatask_run_ensemble:
@@ -49,7 +49,7 @@
nnodes:
nodesize:
ppn:

data:
obs:
RAP_obs: /scratch2/BMC/public/data/grids/rap/obs
1 change: 1 addition & 0 deletions ush/machine/jet.yaml
@@ -34,6 +34,7 @@ platform:
FIXsfc: /mnt/lfs4/HFIP/hfv3gfs/role.epic/UFS_SRW_data/develop/fix/fix_sfc_climo
FIXshp: /mnt/lfs4/HFIP/hfv3gfs/role.epic/UFS_SRW_data/develop/NaturalEarth
EXTRN_MDL_DATA_STORES: hpss aws nomads
LIGHTNING_ROOT: /mnt/lfs4/HFIP/hfv3gfs/role.epic/UFS_SRW_data/develop/rrfs_retro_data/lightning/vaisala/netcdf
data:
ics_lbcs:
FV3GFS:
1 change: 1 addition & 0 deletions ush/valid_param_vals.yaml
@@ -85,3 +85,4 @@ valid_vals_DO_AQM_CHEM_LBCS: [True, False]
valid_vals_DO_AQM_GEFS_LBCS: [True, False]
valid_vals_DO_AQM_SAVE_AIRNOW_HIST: [True, False]
valid_vals_COLDSTART: [True, False]
valid_vals_DO_NLDN_LGHT: [True, False]
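
Putting the configuration pieces together: a minimal, hypothetical sketch of a user
experiment config that enables this task might look like the following (the
LIGHTNING_ROOT path is illustrative; on Jet this PR points it at the RRFS retro
data, and any additional workflow/task wiring is outside this diff):

# config.yaml (user experiment configuration) -- illustrative values only
platform:
  LIGHTNING_ROOT: /path/to/lightning/netcdf
rrfs:
  DO_NLDN_LGHT: true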