
Gfs ddt removal #493

Merged: 20 commits, Oct 9, 2020
Conversation

@grantfirl (Collaborator) commented Aug 21, 2020

This pull request is part of a cleanup of ccpp-physics required to transition to the new CCPP code generator. It removes references to GFS-based (host-dependent) derived data types (DDTs) for all EXCEPT the following files (reasons in parentheses; a sketch of the general shape of the change follows the list):

  • GFS_debug.F90 (this scheme is used for debugging purposes and needs access to the host-dependent DDTs in order to write out any/all of their components; without DDTs, the argument list would have to include ALL individual variables (several hundred))
  • GFS_phys_time_vary.fv3.F90, GFS_rad_time_vary.fv3.F90 (currently part of the "time_vary" step; will be moved to new time_step_init phase by @climbfuji; potentially needs access to data from all blocks)
  • GFS_phys_time_vary.scm.F90, GFS_rad_time_vary.scm.F90 (currently part of the "time_vary" step; will be moved to new time_step_init phase by @climbfuji; should match the FV3 versions as much as possible)
  • GFS_suite_interstitial.F90 (contains GFS_suite_interstitial_phys_reset_run and GFS_suite_interstitial_rad_reset_run that call type-bound reset procedures, so they need access to GFS_typedefs)
  • gcycle.F90 (currently part of the "time_vary" step; potentially needs access to data from all blocks)
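For orientation, here is a sketch of the general shape of the change: a scheme's run routine that previously received GFS DDTs now receives only the individual variables it uses, declared assumed-shape. The scheme name and the specific variables below are illustrative, not copied from the actual diffs.

  ! Before: host-dependent GFS DDTs in the argument list
  subroutine example_scheme_run (Model, Grid, Statein, errmsg, errflg)
     use GFS_typedefs, only: GFS_control_type, GFS_grid_type, GFS_statein_type
     type(GFS_control_type), intent(in)    :: Model
     type(GFS_grid_type),    intent(in)    :: Grid
     type(GFS_statein_type), intent(inout) :: Statein
     character(len=*),       intent(out)   :: errmsg
     integer,                intent(out)   :: errflg
     ! ... uses Model%dtp, Grid%xlat, Statein%tgrs, ...
  end subroutine example_scheme_run

  ! After: only the individual, host-agnostic variables, declared assumed-shape
  subroutine example_scheme_run (dtp, xlat, tgrs, errmsg, errflg)
     use machine, only: kind_phys
     real(kind=kind_phys),                 intent(in)    :: dtp
     real(kind=kind_phys), dimension(:),   intent(in)    :: xlat
     real(kind=kind_phys), dimension(:,:), intent(inout) :: tgrs
     character(len=*),                     intent(out)   :: errmsg
     integer,                              intent(out)   :: errflg
     ! ... uses dtp, xlat, tgrs directly
  end subroutine example_scheme_run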

This PR requires associated PRs in the host models since new metadata (and a few new array indices) are required:
NOAA-EMC/fv3atm#162
NCAR/ccpp-scm#197

A few other partial cleanup tasks were undertaken while editing files, such as changing horizontal_domain to horizontal_loop_extent in the metadata dimensions, renaming a few standard names (time_step -> timestep), fixing '???' units, and using assumed-shape declarations for some of the variables that I touched.
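For reference, a metadata stanza for one of the cleaned-up variables would look roughly like the following after this kind of change (using coszen from the review snippet further down; the standard_name and long_name here are illustrative, not copied from the actual .meta files):

  [coszen]
    standard_name = cosine_of_zenith_angle
    long_name = mean cosine of zenith angle over the radiation call period
    units = none
    dimensions = (horizontal_loop_extent)
    type = real
    kind = kind_phys
    intent = inout

The dimensions entry is where horizontal_domain was replaced by horizontal_loop_extent.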

@grantfirl (Collaborator Author)

This supersedes #382.

@climbfuji (Collaborator) left a comment

I am not sure whether we should make changes to physics/GFS_phys_time_vary.scm.{F90,meta} and physics/GFS_rad_time_vary.scm.{F90,meta} at this point. For FV3, we will need to create timestep_init and timestep_final phases and run the time_vary group in the timestep_init phase (without threading over blocked data structures). It would probably be better to hold off changing the SCM version of those files and make the same changes as for FV3 later. The rest looks good. Changing several of the standard names (time_step to timestep) requires an fv3atm PR.

@grantfirl (Collaborator Author)

> I am not sure whether we should make changes to physics/GFS_phys_time_vary.scm.{F90,meta} and physics/GFS_rad_time_vary.scm.{F90,meta} at this point. For FV3, we will need to create timestep_init and timestep_final phases and run the time_vary group in the timestep_init phase (without threading over blocked data structures). It would probably be better to hold off changing the SCM version of those files and make the same changes as for FV3 later. The rest looks good. Changing several of the standard names (time_step to timestep) requires an fv3atm PR.

Yes, I debated whether or not to do anything with the SCM time_vary files, but since the SCM isn't passing in the "all blocks" versions of the DDTs, I thought it was OK to clean these up like the other files. I know that you are going to fix the time_vary routines, but I had forgotten that you were going to implement them as the new phase. So, I was trying to save you some work, but I'll go ahead and remove these files from the PR.

As far as propagating GFS_typedef changes to fv3atm, I am planning on doing this. There are other changes besides the time_step -> timestep one that need to propagate.

@grantfirl (Collaborator Author)

@climbfuji, I'm still in the process of cleaning this up, BTW, hence the draft status. Regarding the time_vary stuff, the code will still need to exist even in the new timestep_init phase, right? Will that phase still rely on GFS DDTs? If not, then perhaps some of the changes in these files can be used. I'll clean these up (formatting) and commit that, so that the files can be used if necessary and there is a record of them in their cleaned-up state, then commit the originals back.

@grantfirl (Collaborator Author)

Two questions:

One for @climbfuji: Based on your comment in another PR, every array in every CCPP entry point should be assumed-shape going forward?

One for @dustinswales: This PR is adding the diagnostic fluxr array as a CCPP variable. If I were to uncomment the sections in GFS_rrtmgp_[sw,lw]_post.F90, and pass fluxr in to populate, would all of the antecedent variables be populated by RRTMGP, as far as you know, such that fluxr could be populated correctly?

@climbfuji (Collaborator)

> Two questions:
>
> One for @climbfuji: Based on your comment in another PR, every array in every CCPP entry point should be assumed-shape going forward?
>
> One for @dustinswales: This PR is adding the diagnostic fluxr array as a CCPP variable. If I were to uncomment the sections in GFS_rrtmgp_[sw,lw]_post.F90, and pass fluxr in to populate, would all of the antecedent variables be populated by RRTMGP, as far as you know, such that fluxr could be populated correctly?

It would be safer to use assumed-shape arrays everywhere, yes, but it is not mandatory.
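For context, the difference is just in the dummy-argument declaration (illustrative declarations, not taken from a specific file):

  ! explicit shape: the extents (im, levs) must also be passed in and kept consistent
  real(kind=kind_phys), dimension(im,levs), intent(inout) :: tgrs

  ! assumed shape: the shape comes from the actual argument at the call site
  real(kind=kind_phys), dimension(:,:),     intent(inout) :: tgrs

Assumed-shape dummies need an explicit interface, which the CCPP schemes already have since they are module procedures.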

@dustinswales (Collaborator)

@grantfirl Yes, the fluxr array would be populated correctly in the _post routines.

@climbfuji (Collaborator)

@grantfirl now that the "RRTMG cloud overlap method update" PR #487 has been merged, we need to update this PR. Do you have time to do this, or should I do it as part of merging your PR into the newly created feature/transition-to-capgen-1 branch?

@grantfirl (Collaborator Author) commented Sep 25, 2020 via email

@climbfuji (Collaborator) commented Sep 25, 2020 via email

@climbfuji (Collaborator)

@grantfirl I just realized that updating this PR isn't necessary; I'll do that anyway when I pull it into the feature/transition-to-capgen-1 branch. But I need to ask you to look at the CCPP tendencies PR ... will post a note there. Thanks!

@grantfirl (Collaborator Author)

> @grantfirl I just realized that updating this PR isn't necessary; I'll do that anyway when I pull it into the feature/transition-to-capgen-1 branch. But I need to ask you to look at the CCPP tendencies PR ... will post a note there. Thanks!

Even though there are conflicting files in this PR?

@climbfuji (Collaborator)

>> @grantfirl I just realized that updating this PR isn't necessary; I'll do that anyway when I pull it into the feature/transition-to-capgen-1 branch. But I need to ask you to look at the CCPP tendencies PR ... will post a note there. Thanks!
>
> Even though there are conflicting files in this PR?

I can solve these conflicts when I pull in the code, no problem.


Inline review thread on these lines:

real(kind=kind_phys), dimension(:), intent(inout) :: coszen, coszdg
real(kind=kind_phys), dimension(:,:), intent(inout) :: phy_f3d_leffr, &
Comment from a Collaborator:

Why do these effective radii carry the prefix phy_f3d_? This routine shouldn't care at all where/how they are stored in the host model.

Reply from grantfirl (Collaborator Author):

Very good question. I had no idea what I was thinking until I looked at the whole file again. I believe my thought process was along these lines: because this is an interstitial scheme that came from GFS_physics_driver, where DDTs were used nearly everywhere, it may make sense, for code continuity, to record in the Fortran names where these variables came from. This is a pretty weak argument, though. From a CCPP point of view, it probably makes sense to drop the phy_f3d part. I'll push a commit to remove phy_f3d from the Fortran names unless you buy the original reasoning.
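Concretely, the rename being discussed amounts to something like this (the new Fortran name below is a placeholder; whatever the commit actually uses may differ):

  ! before: the Fortran dummy name records where the host model stores the field
  real(kind=kind_phys), dimension(:,:), intent(inout) :: phy_f3d_leffr

  ! after: a host-agnostic Fortran name; the standard_name in the .meta file
  ! still identifies the field to the host
  real(kind=kind_phys), dimension(:,:), intent(inout) :: effr_liq   ! placeholder name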

Reply from a Collaborator:
I agree with dropping the phy_f3d part, thanks for the explanation. Can you please wait until my next PR, which merges your GFS DDT removal into the feature/transition-to-capgen-1 branch, is open? This way I won't have to fix merge conflicts again. Thanks!

Reply from grantfirl (Collaborator Author):
Yes, I'll look out for that.

@climbfuji (Collaborator)

This PR was merged into #506 for NCAR's feature/transition-to-capgen-1 branch. These changes will be merged into the authoritative branch in a couple of weeks, and this PR should be flagged as "merged" automatically.

@climbfuji merged commit 474144f into NCAR:master on Oct 9, 2020