init_interp does not work for CNDV or FATES, or any future internally-generated land-cover dynamics #76

Open
ekluzek opened this issue Dec 16, 2017 · 8 comments
Labels
enhancement: new capability or improved behavior of existing capability

Comments

@ekluzek (Collaborator) commented Dec 16, 2017

Sam Levis < slevis > - 2010-03-18 11:33:09 -0600
Bugzilla Id: 1127
Bugzilla Depends: 1303,
Bugzilla CC: andre, dlawren, erik, rfisher, sacks,

Interpinic has not worked for the old DGVM since probably before clm3.5. It has not yet been tested with CNDV, so we assume it does not work there either.

With clm4 we will supply spun-up initial conditions for CNDV for year-2000, 2-degree simulations. Users will need to complete their own spin-ups for other CNDV simulations.

We need to convey the above info in the clm4 user's guide.

@ekluzek added this to the future milestone Dec 16, 2017
@ekluzek (Collaborator, Author) commented Dec 16, 2017

Erik Kluzek < erik > - 2011-03-02 13:01:29 -0700

After discussing this with Sam, we think the issue with interpinic is that it will need to map the PFT weights. As it works now, interpinic only maps variables where the PFT weights are non-zero and leaves the weights themselves as they are. For CNDV, the PFT weights will need to be mapped along with the other variables.
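
A minimal sketch of the difference, with hypothetical names (wt_pft_out, state_in, source_pft, and so on are illustrative, not the actual interpinic code):

! Current behavior, roughly: state is copied only where the output PFT
! already has a non-zero weight, and the weights themselves are left alone.
do p = begp, endp
   if (wt_pft_out(p) > 0._r8) then
      state_out(p) = state_in(source_pft(p))
   end if
end do

! What CNDV would need, roughly: the PFT weights are mapped from the source
! along with the state, because CNDV evolves those weights internally.
do p = begp, endp
   wt_pft_out(p) = wt_pft_in(source_pft(p))
   state_out(p)  = state_in(source_pft(p))
end do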

@ekluzek (Collaborator, Author) commented Dec 16, 2017

Sam Levis < slevis > - 2012-06-25 13:07:59 -0600

I'm posting this info in case we decide we want to pursue this fix.

Guiling Wang (UConn) got interpinic to work with CNDV. She sent me her interpinic.F90 and BiogeophysRestMod.F90 with the following email dated 5/30/2012, which I have filed in my /clm/interpinic email folder:

Hi Sam,

This is the interpinic.F90 I modified. In addition to the changes to make sure that all 17 PFTs at an individual grid cell come from the same grid cell, I also needed to take several variables out of the list of cycled variables. I therefore need the corresponding change in BiogeophysRestMod.F90 (the segment of the code labelled "EaSM"). I think it is good to keep these changes in BiogeophysRestMod.F90 for future releases:
(1) If the PFT_ variables are not cycled in interpinic, the model does not work without these changes, so in that case they are absolutely necessary.
(2) If the PFT_ variables are cycled, then the changes I added to BiogeophysRestMod won't make any difference.
So the added portion works either way.

FYI, most of the changes are indicated with EaSM, either at the end of the line, or at the beginning and end of a segment of code.

Please let me know if you spot any problem.

Thanks,
Guiling

@ekluzek (Collaborator, Author) commented Dec 16, 2017

Bill Sacks < sacks > - 2015-09-18 11:46:50 -0600

In principle, this same problem would apply to any aspect of dynamic landunits / columns / patches that is generated internally by CLM. So far, I believe that would just apply to ED: all other aspects of the dynamics are either read from file (transient crops & PFTs) or come from another component (dynamic glacier area).

@ekluzek (Collaborator, Author) commented Dec 16, 2017

Bill Sacks < sacks > - 2015-09-18 11:48:48 -0600

I also noticed that the allPFTSfromSameGC flag that used to be present in interpinic is no longer present in the clm4.5 and later initInterp. I believe that flag related to the operation of CNDV. So something like that may need to be brought back if we want interpinic to work for CNDV.
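
Roughly, such a flag would force every PFT of an output gridcell to take its source data, weights included, from one and the same input gridcell, rather than letting each PFT find a source independently. A hypothetical sketch (names are illustrative only, not the actual initInterp code):

if (all_pfts_from_same_gc) then
   do g = begg, endg
      g_src = closest_input_gridcell(g)     ! one source gridcell per output gridcell
      do p = pft_begin(g), pft_end(g)
         p_src = pft_of_type_on_gridcell(pft_type(p), g_src)
         wt_pft_out(p) = wt_pft_in(p_src)   ! weights stay mutually consistent
         state_out(p)  = state_in(p_src)
      end do
   end do
end if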

@ekluzek (Collaborator, Author) commented Dec 16, 2017

Bill Sacks < sacks > - 2016-11-04 13:49:10 -0600

This is also somewhat of an issue for glaciers, for which the area comes from GLC, since the fields from GLC aren't available at initialization.

I'm working around that issue with some special-purpose code for glaciers at the start of the driver run loop, like this (in clm_driver.F90):

! ========================================================================
! In the first time step of a run that used cold start or init_interp, glacier areas
! will start at whatever is specified on the surface dataset, because coupling fields
! from GLC aren't received until the run loop. Thus, CLM will see a potentially large,
! fictitious glacier area change in the first time step after cold start or
! init_interp. We don't want this fictitious area change to result in any state or
! flux adjustments. Thus, we apply this area change here, at the start of the driver
! loop, so that in dynSubgrid_driver, it will look like there is no glacier area
! change in the first time step.
!
! This needs to happen very early in the run loop, before any balance checks are
! initialized, because - by design - this doesn't conserve mass at the grid cell
! level. (The whole point of this code block is that we adjust areas without doing
! the typical state or flux adjustments that need to accompany those area changes for
! conservation.)
!
! This accomplishes approximately the same effect that we would get if we were able to
! update glacier areas in initialization. The one difference - and minor, theoretical
! problem - that could arise from this start-of-run-loop update is: If the first time
! step of the CESM run loop looked like: (1) GLC runs and updates glacier area (i.e.,
! glacier area changes in the first time step compared with what was set in
! initialization); (2) coupler passes new glacier area to CLM; (3) CLM runs. Then the
! code here would mean that the true change in glacier area between initialization and
! the first time step would be ignored as far as state and flux adjustments are
! concerned. But this is unlikely to be an issue in practice: Currently GLC doesn't
! update this frequently, and even if it did, the change in glacier area in a single
! time step would typically be very small.
!
! If we are ever able to change the CESM initialization sequence so that GLC fields
! are passed to CLM in initialization, then this code block can be removed.
! ========================================================================

need_glacier_initialization = (is_first_step() .and. &
     (is_cold_start .or. is_interpolated_start))

if (create_glacier_mec_landunit .and. need_glacier_initialization) then
   !$OMP PARALLEL DO PRIVATE (nc, bounds_clump)
   do nc = 1, nclumps
      call get_clump_bounds(nc, bounds_clump)

      call glc2lnd_inst%update_glc2lnd_non_topo( &
           bounds = bounds_clump, &
           glc_behavior = glc_behavior)

      call dynSubgrid_wrapup_weight_changes(bounds_clump, glc_behavior)

   end do
   !$OMP END PARALLEL DO
end if

Ideally, though, we'd either (a) get glacier areas from GLC -> CLM in initialization, or (b) (relevant to this bug report) interpolate these glacier areas in init_interp. If we did (b) then I could probably remove the check for is_interpolated_start in the above code.

@ekluzek (Collaborator, Author) commented Dec 16, 2017

Bill Sacks < sacks > - 2016-11-04 13:50:25 -0600

However, regarding comment 7 (now #76 (comment)): it's still possible that we'd want to avoid doing the state / flux adjustments in the first time step after init_interp, because we may often still have a large (fictitious) change in glacier area in that first time step. This would need more thought.

@billsacks changed the title from "init_interp does not work for CNDV or ED, or any future internally-generated land-cover dynamics" to "init_interp does not work for CNDV or FATES, or any future internally-generated land-cover dynamics" Jan 16, 2018
@ESCOMP deleted a comment from ekluzek Feb 28, 2018
@ESCOMP deleted a comment from ekluzek Feb 28, 2018
@billsacks (Member) commented:

Regarding #76 (comment) -- see #340. Summary: I plan to change that logic to avoid doing the dynamic landunit adjustment fluxes in the first time step of any run.

@billsacks (Member) commented:

Fixing #346 will partially address this issue, but there will still be a need for something more general to handle interpolating to a different grid: note that #346 proposes to find a point at the exact same grid cell. I thought about generalizing it to find the closest gridcell and copying everything (including subgrid areas) from there, but that won't work in general, because the target output point may not contain all of the landunits/columns/patches from the closest input point, so areas won't sum to 1. So I'm currently thinking that we'll need separate modes of operation for different use cases, such as the use case of handling dynamic vegetation. Alternatively, maybe we need a fundamentally different scheme to handle this more generally. I'm not entirely sure.
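
To illustrate the area-sum problem with a straight nearest-gridcell copy (all names below are hypothetical, not actual init_interp code):

! Copy landunit weights from the closest input gridcell to the target gridcell.
! If the target's subgrid structure lacks some of the source's landunits,
! that part of the copied area has nowhere to go.
wt_sum = 0._r8
do l = lun_begin(g_target), lun_end(g_target)
   l_src = matching_landunit(landunit_type(l), g_closest_input)
   if (l_src > 0) then
      wt_lunit_out(l) = wt_lunit_in(l_src)
   else
      wt_lunit_out(l) = 0._r8
   end if
   wt_sum = wt_sum + wt_lunit_out(l)
end do
! wt_sum can come out less than 1, so a plain copy is not sufficient in general;
! the missing area would have to be redistributed or handled in some other way.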

@billsacks added the enhancement label Mar 13, 2019
@ekluzek removed this from the future milestone Aug 12, 2019