
Fixes for the LUNA dayl bugs #961

Merged
merged 15 commits into from
Jun 26, 2020

Conversation

ekluzek
Collaborator

@ekluzek ekluzek commented Apr 2, 2020

Description of changes

Fixes two of the LUNA day length bugs that @lmbirch89 found. Use the night term for the night section rather than the day version, and instead of assuming a 12-hour maximum day length, use the actual maximum day length.
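The effect of the normalization change can be sketched in Python (a hypothetical illustration only, not the actual Fortran in LunaMod; the function and argument names are invented):

```python
def day_fraction(dayl_hours, max_dayl_hours):
    """Illustrative only: compare the buggy 12-hour assumption with
    normalizing by the location's actual maximum day length."""
    # Buggy: assumes the maximum possible day length is 12 hours, so at
    # high latitudes (day length up to 24 h) the fraction can exceed 1.
    buggy = dayl_hours / 12.0
    # Fixed: normalize by the actual maximum day length and clamp to [0, 1].
    fixed = min(max(dayl_hours / max_dayl_hours, 0.0), 1.0)
    return buggy, fixed

# A high-latitude summer day: 18 h of daylight, maximum day length 24 h
buggy, fixed = day_fraction(18.0, 24.0)  # buggy = 1.5, fixed = 0.75
```

A day fraction above 1 is what artificially inflated Jmax at high latitudes.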

Specific notes

Contributors other than yourself, if any: @lmbirch89

CTSM Issues Fixed (include github issue #): #953 #958

Fixes #953
Fixes #958

Are answers expected to change (and if so in what way)? Yes
(when LUNA on, which is always for CLM5.0, but NOT for CLM4.5)

Any User Interface Changes (namelist or namelist defaults changes)? No

Testing performed, if any: regular
Running regular testing now on cheyenne and izumi.
@olyson will also run a longer simulation with it.

@ekluzek
Collaborator Author

ekluzek commented Apr 2, 2020

Testing on cheyenne and izumi has passed as expected. Clm45 and Fates cases compare the same, as do test cases with flexCN_FUN and noFUN_flexCN, since LUNA is off for those.

The following tests are also identical to the previous case even though LUNA is on. The single point cases might just not see enough of a change, but I'm surprised about the f09, f10 and hcru ones.

  • ERS_D_Ld7_Mmpi-serial.1x1_smallvilleIA.IHistClm50BgcCropGs.cheyenne_intel.clm-decStart1851_noinitial
  • ERS_Ld5_Mmpi-serial.1x1_vancouverCAN.I1PtClm50SpRsGs.cheyenne_intel.clm-default
  • FUNITCTSM_P1x1.f10_f10_musgs.I2000Clm50SpGs.cheyenne_intel
  • PFS_Ld20.f09_g17.I2000Clm50BgcCrop.cheyenne_intel
  • SMS_D_Ld1_Mmpi-serial.1x1_mexicocityMEX.I1PtClm50SpRsGs.cheyenne_intel.clm-default
  • SMS_D_Ld1_Mmpi-serial.1x1_vancouverCAN.I1PtClm50SpRsGs.cheyenne_gnu.clm-default
  • SMS_D_Ld1_Mmpi-serial.1x1_vancouverCAN.I1PtClm50SpRsGs.cheyenne_intel.clm-default
  • SMS_D_Ld1_Mmpi-serial.f45_f45_mg37.I2000Clm50SpGs.cheyenne_gnu.clm-ptsRLA
  • SMS_D_Ld1_Mmpi-serial.f45_f45_mg37.I2000Clm50SpGs.cheyenne_intel.clm-ptsRLA
  • SMS_D_Ld5_Mmpi-serial.1x1_mexicocityMEX.I1PtClm50SpRsGs.cheyenne_intel.clm-default
  • SMS_D_Lm1_Mmpi-serial.CLM_USRDAT.I1PtClm50SpRsGs.cheyenne_intel.clm-USUMB
  • SMS_Ld1_Mmpi-serial.1x1_mexicocityMEX.I1PtClm50SpRsGs.cheyenne_intel.clm-default
  • SMS_Ld1_Mmpi-serial.f45_f45_mg37.I2000Clm50SpGs.cheyenne_gnu.clm-ptsRLA
  • SMS_Ld1_Mmpi-serial.f45_f45_mg37.I2000Clm50SpGs.cheyenne_gnu.clm-ptsRLB
  • SMS_Ld1_Mmpi-serial.f45_f45_mg37.I2000Clm50SpGs.cheyenne_gnu.clm-ptsROA
  • SMS_Ld1_Mmpi-serial.f45_f45_mg37.I2000Clm50SpGs.cheyenne_intel.clm-ptsRLA
  • SMS_Ly1_Mmpi-serial.1x1_vancouverCAN.I1PtClm50SpRsGs.cheyenne_gnu.clm-output_sp_highfreq
  • SMS_P720x1_Ln6.hcru_hcru.I2000Clm50BgcCruGs.cheyenne_intel.clm-coldStart
  • ERS_D_Ld5_Mmpi-serial.1x1_mexicocityMEX.I1PtClm50SpRsGs.izumi_nag.clm-default
  • SMS_D_Ld1_Mmpi-serial.1x1_vancouverCAN.I1PtClm50SpRsGs.izumi_nag.clm-default
  • SMS_D_Ld1_Mmpi-serial.f45_f45_mg37.I2000Clm50SpGs.izumi_nag.clm-ptsRLA
  • SMS_Ld1_Mmpi-serial.f45_f45_mg37.I2000Clm50SpGs.izumi_nag.clm-ptsRLA

@lmbirch89
Contributor

@ekluzek I'm not surprised that some point simulations were not affected much as they don't appear to be in the high latitudes, but if the others are a global grid, I am surprised that it would be identical.

@ekluzek
Collaborator Author

ekluzek commented Apr 3, 2020

Yes, exactly I find that odd as well. But, most of the global grid tests do show a difference in answers, so I suppose it's not necessarily a problem.

Bring documentation source to master

1. Bring documentation source to master: Pulls in the source from
   https://github.com/escomp/ctsm-docs. This is important so that
   documentation can remain in sync with changes in the model
   code. Images are stored here using git-lfs (Git Large File
   Storage). I also made some minor fixes to get the pdf build of the
   tech note working.

2. Use a different documentation theme that supports a version dropdown
   menu, and add the code needed to support this versioning on the
   documentation web pages. At a high level, the way the versioned
   documentation works is to have separate subdirectories in the
   gh-pages branch of the ctsm-docs repository for each version of the
   documentation we want to support. There is then a bit of JavaScript
   code which uses a json file in the gh-pages branch to determine which
   versions exist and how these should be named in the dropdown
   menu. Most of these changes were borrowed from ESMCI/cime#3439, which
   in turn borrowed from ESCOMP/CISM-wrapper#23, which in turn was a
   slight modification of an implementation provided by @mnlevy1981 for
   the MARBL documentation, which in turn borrowed from an
   implementation put together by Unidata (credit where credit is due).

   I am not aware of out-of-the-box support for a version pull-down in
   out-of-the-box sphinx themes (though the last time I looked was in
   Fall, 2018, so there may be something available now). However,
   support for a version dropdown exists in an open PR in the sphinx
   readthedocs theme repository: readthedocs/sphinx_rtd_theme#438. I
   have pushed this branch to a new repository in ESMCI
   (https://github.com/ESMCI/sphinx_rtd_theme) to facilitate long-term
   maintenance of this branch in case it disappears from the official
   sphinx_rtd_theme repository. I have also cherry-picked a commit onto
   that branch, which is needed to fix search functionality in sphinx1.8
   (from readthedocs/sphinx_rtd_theme#672) (which is another reason for
   maintaining our own copy of this branch). The branch in this
   repository is now named version-dropdown-with-fixes (branching off of
   the version-dropdown branch in the sphinx_rtd_theme repository). In
   the long-term, I am a little concerned about using this theme that
   isn't showing any signs of being merged to the main branch of the
   readthedocs theme, but this has been working for us in other projects
   for the last 2 years, so I feel this is a reasonable approach in the
   short-medium term.

The new process for building the documentation is given here:
https://github.com/ESCOMP/CTSM/wiki/Directions-for-editing-CLM-documentation-on-github-and-sphinx

Resolves ESCOMP#239
@ekluzek
Collaborator Author

ekluzek commented Apr 9, 2020

I reran testing with the argument fix and things ran as expected other than this test that failed as follows...

SMS_Ld2_D.f09_g17.I1850Clm50BgcCropCmip6.cheyenne_intel.clm-basic_interp

36: size=33687990 rss=61878 share=18331 text=23647 datastack=0
0: size=33655620 rss=37175 share=10507 text=23647 datastack=0
0: memory_write: model date = 00010103 0 memory = 128211.89 MB (highwater) 141.62 MB (usage) (pe= 0 comps= ATM IAC ESP)
133: p2g_1d error: sumwt is greater than 1.0 at g= 1176
133: calling getglobalwrite with decomp_index= 1176 and clmlevel= gridcell
133: local gridcell index = 1176
133: global gridcell index = 19898
133: gridcell longitude = 271.250000000000
133: gridcell latitude = 72.0942408376963
133: ENDRUN:

But, everything else passed as expected.

@olyson
Contributor

olyson commented Apr 9, 2020

Diagnostics for this bug fix are here:

http://webext.cgd.ucar.edu/I1850/clm50_ctsm10d089_2deg_GSWP3V1_luna_1850/lnd/clm50_ctsm10d089_2deg_GSWP3V1_luna_1850.51_80-clm50_ctsm10d089_2deg_GSWP3V1_1850.51_80/setsIndex.html

These are 1850 simulations that have been run for 80 years. Climatology is for the last 30 years. There are trends in some state variables in the LUNA simulation (e.g., TLAI, TOTSOMC), so I'm continuing these simulations a bit further.

@wwieder
Contributor

wwieder commented Apr 9, 2020

Given these differences in GPP and NPP (> 8 & 3 Pg y-1, respectively) that are concentrated at high latitudes, I wouldn't expect soil C pools to come into equilibrium without going back to an AD spinup. @dlawrenncar can we still run transient 20th century simulations w/out the AD spinup and get meaningful results to compare? @lmbirch89 I know it's hard to say, but are the direction and magnitude of the changes seen here consistent with what you'd expect? Happy to help walk through the diagnostics that @olyson sent, if you're not familiar with this fire hose of information.

@wwieder
Contributor

wwieder commented Apr 9, 2020

These 'bugs' also seem like they'll have pretty significant, climate-changing answers, with a big drop in Arctic LAI, ET and LH fluxes paired with higher sensible heat & runoff. @olyson, do you have pft level output on so we can check for survival?

@lmbirch89
Contributor

lmbirch89 commented Apr 9, 2020

@wwieder These are definitely a lot of figures! I think I followed them though. The red lines are the bug fixes to LUNA, right? The decrease in productivity is expected because the day fraction before used to be larger than 1 at high latitudes, which was increasing productivity artificially by inflating Jmax. It's consistent with everything I was simulating.

@wwieder
Contributor

wwieder commented Apr 9, 2020

One more question on spinup: the transient effect of reduced productivity and SOMC losses means that high latitude ecosystems are flush with N, which causes big drops in plant stoichiometry. This makes me less sure that these results will be meaningful in a transient run w/o an AD spinup.

@wwieder
Contributor

wwieder commented Apr 9, 2020

@olyson let's hold off on running more simulations for a bit. This initialization gives us enough information to know that the bug fixes really are answer changing, and we need to learn a bit more before making a plan about how to move forward to maintain backwards compatibility for CLM5.0 physics.

@wwieder
Contributor

wwieder commented Apr 9, 2020

The third potential LUNA bug identified was initial Vcmax and Jmax values used when deciduous plants put on leaves. The code in question is here

@lmbirch89 says the model is very sensitive to the values of these initial conditions and suggests we use the accumulator to save values from the previous year, as opposed to the hard coded values currently being used (see #947). I've asked @xuchongang to comment, but do we want to bring these initialization values for Vcmax and Jmax in as a bug fix, or with the other arctic C cycle updates?
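The accumulator idea can be sketched as follows (a hypothetical Python illustration; the class and method names are invented, and the fallback constants are the hard-coded vcmx25 = 50 and jmx25 = 85 values noted elsewhere in this thread):

```python
# Hypothetical sketch: instead of hard-coded initial Vcmax25/Jmax25 at
# leaf-out, carry over the values saved at the end of the previous
# growing season, falling back to constants only on a cold start.

HARDCODED_VCMX25 = 50.0  # umol CO2/m**2/s, current hard-coded value
HARDCODED_JMX25 = 85.0   # umol electron/m**2/s, current hard-coded value

class LeafOutInitializer:
    def __init__(self):
        # populated at the end of each growing season
        self.vcmx25_prev = None
        self.jmx25_prev = None

    def save_end_of_season(self, vcmx25, jmx25):
        # save the last valid (active-leaf) values for use next year
        self.vcmx25_prev = vcmx25
        self.jmx25_prev = jmx25

    def initial_values(self):
        # cold start (nothing saved yet): fall back to the constants
        if self.vcmx25_prev is None:
            return HARDCODED_VCMX25, HARDCODED_JMX25
        return self.vcmx25_prev, self.jmx25_prev
```

In the real model this state would live on the restart file (as in the vcmx_prevyr/jmx_prevyr variables reviewed below), so it survives across run segments.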

…m the previous year is normally used (unless it's a restart) rather than hard-coded values; addresses ESCOMP#981
Contributor

@wwieder wwieder left a comment

Minor changes suggested here, @ekluzek

enzs_z => photosyns_inst%enzs_z_patch & ! Output: [real(r8) (:,:) ] enzyme decay status 1.0-fully active; 0-all decayed during stress
enzs_z => photosyns_inst%enzs_z_patch , & ! Output: [real(r8) (:,:) ] enzyme decay status 1.0-fully active; 0-all decayed during stress
vcmx_prevyr => photosyns_inst%vcmx_prevyr , & ! Output: [real(r8) (:,:) ] patch leaf Vc,max25 from previous year avg
jmx_prevyr => photosyns_inst%jmx_prevyr & ! Output: [real(r8) (:,:) ] patch leaf Jmax25 from previous year avg
Contributor

The comments here should say not the previous year avg, but rather the values from the end of the growing season for the previous year. @billsacks did you have thoughts about how to name this variable in the first place?

Member

My main thought was to avoid having "yr" / "year" in the name. I suggested vcmx_last_active_day, but I'm not sure if that's the best. @ekluzek suggested something including "last_valid", which also seems reasonable. Good naming is so hard... (and yet so important).

Member

Another note about these names: they need _patch suffixes. Also, it's probably best if they maintain the 25_z that appears in vcmx25_z and jmx25_z variable names, to show the correspondence between these variables.

Collaborator Author

We'll fix this in #1060

@@ -183,6 +183,8 @@ module PhotosynthesisMod
! LUNA specific variables
real(r8), pointer, public :: vcmx25_z_patch (:,:) ! patch leaf Vc,max25 (umol CO2/m**2/s) for canopy layer
real(r8), pointer, public :: jmx25_z_patch (:,:) ! patch leaf Jmax25 (umol electron/m**2/s) for canopy layer
real(r8), pointer, public :: vcmx_prevyr (:,:) ! patch leaf Vc,max25 previous year running avg
real(r8), pointer, public :: jmx_prevyr (:,:) ! patch leaf Jmax25 previous year running avg
Contributor

As in LunaMod, this comment is misleading since we're not taking the running average. We should be consistent with LunaMod.

Collaborator Author

We'll fix this in #1060

@@ -328,6 +330,8 @@ subroutine InitAllocate(this, bounds)
! statements.
allocate(this%vcmx25_z_patch (begp:endp,1:nlevcan)) ; this%vcmx25_z_patch (:,:) = 30._r8
allocate(this%jmx25_z_patch (begp:endp,1:nlevcan)) ; this%jmx25_z_patch (:,:) = 60._r8
allocate(this%vcmx_prevyr (begp:endp,1:nlevcan)) ; this%vcmx_prevyr (:,:) = 85._r8
allocate(this%jmx_prevyr (begp:endp,1:nlevcan)) ; this%jmx_prevyr (:,:) = 50._r8
Contributor

Hopefully it doesn't really matter, but the hard coded values we're replacing in LUNA were for vcmx25 = 50 (not 85) & jmx25 = 85 (not 50).
In lines 331-2 of PhotosynthesisMod, vcmx25 and jmx25 are initialized to 30 & 60, respectively. Either of these seems preferable, but I'm not sure which is better?

Member

Thank you for raising this point here @wwieder . In addition to cold start, times I can think of when these initial values will matter are (1) in the first year of existence of a crop that newly appears in a grid cell in a transient case; and (2) in the first year after restart for cases using old restart files, including all cases run from master until we update our out-of-the-box finidat files. So, while these values may not be hugely important, they seem more important than your typical cold start initialization value.

Member

See also comment from @lmbirch89: #947 (comment)

Collaborator Author

Since our simulations have been run with these values, we'll need to get the correct values in #947.

Collaborator Author

We'll fix this for #1107

Collaborator Author

This is coming in with my PR #1102

call restartvar(ncid=ncid, flag=flag, varname='jmx_prevyr', xtype=ncd_double, &
dim1name='pft', dim2name='levcan', switchdim=.true., &
long_name='avg carboxylation rate at 25 celsius for canopy layers', units='umol CO2/m**2/s', &
interpinic_flag='interp', readvar=readvar, data=this%jmx_prevyr)
Contributor

long names here should match what we're using in the description above, something about the previous year?

Minor correction, but jmx25_z_patch and jmx_prevyr long_name should be maximum rate of electron transport at 25 celsius for canopy layers, with units umol electron/m**2/s

Collaborator Author

We'll fix this in #1060

@ekluzek
Collaborator Author

ekluzek commented Apr 15, 2020

Good suggestions @wwieder. I just brought in exactly what was on @lmbirch89's branch. These changes shouldn't change answers, so I'm going to wait on making them until after we're sure this is what we want to bring in. So I'll wait for @olyson to run testing on them; then I can verify that these updates don't change answers.

@billsacks
Member

billsacks commented Apr 15, 2020

These changes shouldn't change answers

@ekluzek - note that @wwieder 's question about the initialization values could change answers, though hopefully wouldn't change things too dramatically.


@wwieder
Contributor

wwieder commented Apr 16, 2020

@olyson put up diagnostics for

Specifically @lmbirch89 there doesn't seem to be much of an effect on the initial conditions for photosynthetic capacity. I'd expected some shift in GPP early in the growing season, but it's not obvious. Is this consistent with your findings?

@lmbirch89
Contributor

@wwieder Yes, I found that initial conditions didn't matter too much if you are running the model for many years. I think the initialization only matters if you are analyzing less than a year of output.

@wwieder
Contributor

wwieder commented Apr 24, 2020

@olyson ran clean 20th century runs with bug fixes

There are some regular diagnostics for the LUNA bug fixes compared to control here.

and ILAMB results here.

A few observations:

  • Regional heterogeneity of improvement / degradation are pretty striking (better GPP over boreal forests of North America and worse over boreal forests of Eurasia).

  • The phenology of LAI is pretty woeful across high latitudes, so presumably @lmbirch89's changes to phenology and allocation will improve things for CTSM5.1.

  • As @dlawrenncar notes, it does present challenges for using 5.0 physics with the LUNA bug fixes (#961).

  • This makes me wonder if we should look at SP simulations to evaluate the effect of the LUNA bugs on GPP without downstream effects on biogeochemistry? Not sure we'd learn much more beyond "GPP is lower with the LUNA bug fixes", though.

  • Finally, I'm not really sure why GPP is lower in the Eastern US with the bug fixes, but it makes me wonder how much we'll need to reparameterize the model for non-Arctic pfts?

@wwieder wwieder added this to the ctsm5.1.0 milestone May 21, 2020
@wwieder wwieder linked an issue May 26, 2020 that may be closed by this pull request
@ekluzek
Collaborator Author

ekluzek commented Jun 3, 2020

@olyson ran a simulation with this branch and these two namelist changes:

jmaxb1 = 0.17
paramfile = '/glade/p/cgd/tss/people/oleson/modify_param/clm5_params.c200402.slatopA_leafcnA.nc'

for case clm50_ctsm10d089_2deg_GSWP3V1_luna3_jmaxb1-0.17_slatopA_leafcnA_hist. It sounds like we want to bring this in as the updated CLM5.0.

The changes to the paramsfile are to leafcn, leafcn_max, leafcn_min, slatop

RMS leafcn 8.5037E-01 NORMALIZED 3.8059E-02
RMS leafcn_max 6.3095E-01 NORMALIZED 5.6833E-02
RMS leafcn_min 6.3095E-01 NORMALIZED 1.2554E-01
RMS slatop 2.1862E-03 NORMALIZED 6.4128E-02

@wwieder
Contributor

wwieder commented Jun 3, 2020

I'm not sure what the text pasted below means @ekluzek, but yes, we want to bring the bug fixes to master with those two namelist changes from @olyson.


RMS leafcn 8.5037E-01 NORMALIZED 3.8059E-02
RMS leafcn_max 6.3095E-01 NORMALIZED 5.6833E-02
RMS leafcn_min 6.3095E-01 NORMALIZED 1.2554E-01
RMS slatop 2.1862E-03 NORMALIZED 6.4128E-02

@ekluzek
Collaborator Author

ekluzek commented Jun 3, 2020

The paramsfile changes are to indices 11 and 12, which are broadleaf_deciduous_boreal_shrub and c3_arctic_grass. Everything else is the same.

For indices 11 and 12:
slatop: from 0.03072, 0.04024 to 0.028, 0.021
leafcn_min: from 15 to 11.4, 10.7
leafcn_max: from 35 to 31.4, 30.7
leafcn: from 23.2558139534884, 28.0269058295964 to 21.4, 20.7
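The reported slatop RMS is consistent with a root-mean-square difference taken over the full pft parameter vector with only indices 11 and 12 changed. A sketch (the 79-entry vector length is an assumption, not stated in the output above):

```python
import math

# Hypothetical reconstruction of the slatop RMS: a pft parameter vector
# (length 79 assumed) where only entries 11 and 12 differ between files.
old = [0.0] * 79
new = [0.0] * 79
old[11], old[12] = 0.03072, 0.04024
new[11], new[12] = 0.028, 0.021

# RMS over the whole vector; unchanged entries contribute zero difference
rms = math.sqrt(sum((b - a) ** 2 for a, b in zip(old, new)) / len(old))
# rms comes out close to the reported 2.1862E-03
```

Note this only reconstructs the un-normalized RMS column; the normalization used for the NORMALIZED column isn't specified in the thread.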

@wwieder
Contributor

wwieder commented Jun 3, 2020 via email

Add support for new GFDL fv3 cubed sphere grids: low resolution C24 and C48,
moderate resolution C96, and high resolution C192 and C384. Add in mapping files
so all can be created. Make C96 a default resolution created with new surface
datasets. For all of these, only the crop versions of surface and landuse.timeseries
are in place, as non-crop can run in CLM without needing different datasets. Also, for
historical landuse.timeseries we use the SSP5-8.5 timeseries files, so that it can be
used for future, historical, and beyond-2015 present day.
@ekluzek
Collaborator Author

ekluzek commented Jun 25, 2020

All tests have passed as expected, and, as expected, the clm4_5 answers don't change.

@ekluzek ekluzek added PR status: ready PR: this is ready to merge in, with all tests satisfactory and reviews complete and removed PR status: work in progress labels Jun 25, 2020
@ekluzek ekluzek merged commit a3e738c into ESCOMP:master Jun 26, 2020
@ekluzek ekluzek deleted the lunadaylbugs branch June 26, 2020 07:33
negin513 added a commit to ekluzek/CTSM that referenced this pull request Sep 2, 2020
@samsrabin samsrabin added bug something is working incorrectly science Enhancement to or bug impacting science labels Aug 8, 2024
Labels
bug (something is working incorrectly)
PR status: ready (PR: this is ready to merge in, with all tests satisfactory and reviews complete)
science (Enhancement to or bug impacting science)
Development

Successfully merging this pull request may close these issues.

Luna Bug #3, hard coded initial conditions
LUNA day length factor
Incorrect formula in LUNA
6 participants