GFSv16 netcdf post ficein cpl #36

Conversation

@DusanJovic-NOAA (Collaborator) commented on Jan 24, 2020

DusanJovic-NOAA and others added 12 commits January 13, 2020 15:35
* Removing WW3 submodule to point to new branch

* Changing WW3 repo commit to latest in branch GEFS_v12

* Changing WW3 repo commit to latest hash in branch GEFS_v12

* Added GFSv16 operational grids to c768 compset. Changed resources required by waves for this compset to fit in the 1800 s runtime window.

* Adding Dell to allow running c768 compsets

* Adding parameter C768_THRD for expanding threads due to use of higher res GFSv16-wave grids in compset. Updating logs, adjusting c768 resources for Hera.

* Changing resource requirements of c768-ww3 compset to allow faster access to batch queues. Change aligns resource requirements to GFSv16 FV3 and WW3 configs.

* Adding logs after successful runs with new resource settings for c768

* Setting up for updating WW3 submodule

*  Pointing submodule WW3 to latest develop hash.

* Updating to latest FV3 from dusan/gfsv16_netcdf_post_ficein_cpl

* Updating FMS and stochastic_physics module to latest hash

* Updating FV3 to match the develop branch. Previous update to dusan fv3atm repo caused conflict.

* Extending time limit for execution of c768 on Dell

* Updating to latest WW3 develop

* Adding new hera logs for ww3 compsets

* Adding new Cray logs for ww3 compsets

* Updating to latest WW3 develop: bugfixes to the wave components for running under the ufs-weather-model.
Minor changes to cmake and make machine config files (#4)

* Minor changes to cmake and make machine config files

* Update of GFSv15p2 and GFSv16beta regression tests

* Adding new WW3 develop hash. Changing to C768_THRD=1 to ensure b4b reproducibility of ww3 tests.
…ccpp_gfs_v15plus; fix tests fv3_gfs_v15p2, fv3_ccpp_gfs_v15p2, fv3_gfs_v16beta, fv3_ccpp_gfs_v16beta (#6)
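
Several of the commit messages above describe pointing the WW3 submodule at a new branch or hash. As a rough illustration of that workflow, here is a minimal sketch, assuming the GEFS_v12 branch name taken from the messages above; these are not necessarily the exact steps the authors ran:

    cd WW3                               # enter the submodule working tree
    git fetch origin                     # get the latest commits from the WW3 remote
    git checkout origin/GEFS_v12         # or check out a specific commit hash
    cd ..
    git add WW3                          # record the new submodule pointer in the superproject
    git commit -m "Changing WW3 repo commit to latest hash in branch GEFS_v12"
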
.gitmodules: review thread (outdated, resolved)
# COMPILE | REPRO=Y CCPP=Y STATIC=Y SUITES=FV3_GFS_2017,FV3_GFS_2017_stretched 32BIT=Y | standard | wcoss_cray | |
# COMPILE | REPRO=Y CCPP=Y STATIC=Y SUITES=FV3_GFS_2017,FV3_GFS_2017_stretched 32BIT=Y | standard | wcoss_dell_p3 | |
# COMPILE | REPRO=Y CCPP=Y STATIC=Y SUITES=FV3_GFS_2017,FV3_GFS_2017_stretched 32BIT=Y | standard | hera.intel | |
# COMPILE | REPRO=Y CC
Collaborator

I thought the REPRO and CCPP tests would be turned off with the next commit, not this one. I recall you ran them earlier this week, right? It would be good to confirm that they passed (minus the problems with the GFS_v15 and GFS_v15plus tests, which were my fault) so that when we bring these code changes to dtc/develop we can keep testing CCPP against IPD.

Collaborator Author

Yes, I was running the full tests until yesterday morning, when we made the final change to the IPD physics, which changed the results; that's why we temporarily commented out the CCPP tests.

Referenced commit: 56407bb
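
For reference, re-enabling those tests amounts to removing the leading "#" from the corresponding COMPILE lines quoted above in the regression test configuration file (file path assumed here, e.g. tests/rt.conf), for example:

    COMPILE | REPRO=Y CCPP=Y STATIC=Y SUITES=FV3_GFS_2017,FV3_GFS_2017_stretched 32BIT=Y | standard | hera.intel | |
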

Collaborator

Great, thanks for letting me know. We'll do as much as we can to keep up with the GFSv16 changes and test them in dtc/develop over the next few weeks. No promises, though (it depends a lot on the UFS release work).

@DusanJovic-NOAA merged commit 52795b8 into ufs-community:develop on Jan 25, 2020
@DusanJovic-NOAA deleted the gfsv16_netcdf_post_ficein_cpl branch on January 25, 2020 at 00:40
climbfuji added a commit to climbfuji/ufs-weather-model that referenced this pull request Apr 10, 2020
climbfuji pushed a commit to climbfuji/ufs-weather-model that referenced this pull request Aug 14, 2020
LarissaReames-NOAA pushed a commit to LarissaReames-NOAA/ufs-weather-model that referenced this pull request Oct 22, 2021
Add a positive-definite advection option
epic-cicd-jenkins pushed a commit that referenced this pull request Apr 17, 2023
* Point Externals.cfg to release/public-v2 branch of UFS_UTILS

* Point Externals.cfg to develop branch of UFS_UTILS