
Apparent small bug impacting lateral boundary processing of winds #169

Closed
MatthewPyle-NOAA opened this issue Jan 18, 2022 · 8 comments

@MatthewPyle-NOAA
Contributor

Describe the bug
Using the relatively new pressure tendency diagnostic, we’ve noticed a slight, transient increase in domain-averaged surface pressure tendency at the frequency of the lateral boundary updates in limited-area runs. The behavior was traced to the definition of reg_bc_update_time in dyn_core.F90, just prior to the regional_boundary_update calls for the w/u/v variables.

The increased noise seen with the original line:

reg_bc_update_time=current_time_in_seconds+bdt*(n_map-1)+it*dt

goes away when reg_bc_update_time is made more similar to other definitions of this variable:

reg_bc_update_time=current_time_in_seconds+bdt*(n_map-1)+(it-1)*dt

It also seems that the reg_bc_update_time assignment is made in a suboptimal portion of the code and should be shifted out of the #ifndef SW_DYNAMICS block surrounding the 'w' update.
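For concreteness, here is a sketch of the rearrangement proposed in this report. The block structure and the w/u/v call pattern are assumed from the snippets quoted later in this thread, so treat it as illustrative rather than a tested patch:

   if (flagstruct%regional) then
      ! Proposed: hoist the assignment above the #ifndef SW_DYNAMICS block and
      ! use (it-1)*dt, matching the other reg_bc_update_time definitions.
      reg_bc_update_time=current_time_in_seconds+bdt*(n_map-1)+(it-1)*dt
#ifndef SW_DYNAMICS
      if (.not. hydrostatic) then
         call regional_boundary_update(w, 'w', &
                                       isd, ied, jsd, jed, npz, &
                                       is, ie, js, je, &
                                       isd, ied, jsd, jed, &
                                       reg_bc_update_time )
      end if
#endif
      ! ... followed by the corresponding calls for u and v ...
   end if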

To Reproduce
Run the latest UFS weather model with print_diff_pgr = .true. within &gfs_physics_nml
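For reference, the diagnostic is switched on in the physics namelist; a minimal fragment (other &gfs_physics_nml entries omitted) looks like:

&gfs_physics_nml
  print_diff_pgr = .true.
/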

Expected behavior
No increase in noise around boundary update times.
[Figure: comparison of the pressure tendency diagnostic without and with the proposed modification.]

System Environment
Should be system independent.

Additional context
Slides 1-5 of this presentation provide a bit more detail.

@lharris4
Contributor

Hi, Matt. Thank you for your very nice bug report.

After discussion with @kaiyuan-cheng, I believe there is a bug in the regional updating, but this change would not fix it. All of the BC calls made in dyn_core() after the forward-in-time update d_sw() need to be made at the it timelevel, not the it-1 timelevel. Currently the BCs for the pressure variables (assigned by nh_bc()) and for u, v, and w use the right timelevel:

reg_bc_update_time=current_time_in_seconds+bdt*(n_map-1)+it*dt

#ifndef SW_DYNAMICS
      if (.not. hydrostatic) then
         call nested_grid_BC_apply_intT(w, &
              0, 0, npx, npy, npz, bd, split_timestep_BC+1, real(n_split*flagstruct%k_split), &
              neststruct%w_BC, bctype=neststruct%nestbctype )
      end if
#endif SW_DYNAMICS
      call nested_grid_BC_apply_intT(u, &
           0, 1, npx, npy, npz, bd, split_timestep_BC+1, real(n_split*flagstruct%k_split), &
           neststruct%u_BC, bctype=neststruct%nestbctype )
      call nested_grid_BC_apply_intT(v, &
           1, 0, npx, npy, npz, bd, split_timestep_BC+1, real(n_split*flagstruct%k_split), &
           neststruct%v_BC, bctype=neststruct%nestbctype )
   end if

whereas pt, delp, and q_con do not use the right timelevel:
   if (flagstruct%regional) then
      reg_bc_update_time=current_time_in_seconds+bdt*(n_map-1)+(it-1)*dt
      call regional_boundary_update(delp, 'delp', &
                                    isd, ied, jsd, jed, npz, &
                                    is, ie, js, je, &
                                    isd, ied, jsd, jed, &
                                    reg_bc_update_time )
#ifndef SW_DYNAMICS
      call regional_boundary_update(pt, 'pt', &
                                    isd, ied, jsd, jed, npz, &
                                    is, ie, js, je, &
                                    isd, ied, jsd, jed, &
                                    reg_bc_update_time )
#ifdef USE_COND
      call regional_boundary_update(q_con, 'q_con', &
                                    isd, ied, jsd, jed, npz, &
                                    is, ie, js, je, &
                                    isd, ied, jsd, jed, &
                                    reg_bc_update_time )

Compare to the nested-grid code, which uses the right timelevel for all of these variables. A future update should rewrite the regional BC code to use the same routines as the nested-grid BCs.

It is true that the reg_bc_update_time assignment needs to be made outside of the #ifndef block, and also outside of the if (.not. hydrostatic) block.
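To make the described change concrete, a sketch (not a tested patch) of the post-d_sw() block using the it timelevel, based on the snippet quoted above:

   if (flagstruct%regional) then
      ! Use the "it" timelevel for BCs applied after the forward-in-time d_sw() update:
      reg_bc_update_time=current_time_in_seconds+bdt*(n_map-1)+it*dt
      call regional_boundary_update(delp, 'delp', &
                                    isd, ied, jsd, jed, npz, &
                                    is, ie, js, je, &
                                    isd, ied, jsd, jed, &
                                    reg_bc_update_time )
      ! ... pt and q_con are updated with the same reg_bc_update_time ...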

@MatthewPyle-NOAA
Contributor Author

Thanks for the feedback. I'm sure you are correct about what it should be from a science perspective. But the it*dt term for the u and v winds fools the logic in the boundary code on the final acoustic step before fresh boundary information is ingested: the code believes it is at the beginning of the boundary-update period rather than at the end. I had an earlier fix for the issue that adjusts the fraction_interval variable when it has been reset to zero on an acoustic step > 1. That approach touches more routines, since "it" needs to be added as a subroutine argument in various places to make it available down in the regional boundary processing code. Would this approach be a more viable fix from your perspective?

  ! On acoustic steps after the first, fraction_interval having been reset to 0.0
  ! means the interpolation thinks it is at the start of the next boundary-update
  ! period; treat it as the end of the current period instead.
  if (fraction_interval .eq. 0.0 .and. it .gt. 1) then
    fraction_interval=1.0
  endif
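For illustration only, threading the acoustic step index down might look like the call below; the trailing "it" argument is hypothetical, added to the existing interface quoted earlier in this thread:

  ! Hypothetical: pass the acoustic step index "it" to the regional BC code
  ! so the fraction_interval guard above can be applied there.
  call regional_boundary_update(delp, 'delp', &
                                isd, ied, jsd, jed, npz, &
                                is, ie, js, je, &
                                isd, ied, jsd, jed, &
                                reg_bc_update_time, it )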

@lharris4
Contributor

lharris4 commented Jan 19, 2022 via email

@MatthewPyle-NOAA
Contributor Author

Hi Lucas,

The actual fix can be made in the regional BC module (the conditional redefinition of fraction_interval in my comment above), but it requires knowing the value of the acoustic timestep index (it) down in that code. I got it there by adding it as an argument to the regional_boundary_update call. Several source files contain regional_boundary_update calls, so superficial changes are needed in several places to support fixing it this way. Perhaps there is a more elegant way to achieve the same result.

-Matt

@lharris4
Contributor

lharris4 commented Jan 24, 2022 via email

@MatthewPyle-NOAA
Contributor Author

Hi Lucas,

So far I've been working to add these changes on top of the "main" branch code in my fork. Is that the proper approach? I've been struggling to get this atmos_cubed_sphere code compiled within the latest UFS weather model, but I'm a bit out of my element. I just want to make sure I shouldn't be trying to work the change into "gfdl_test" or some other branch of atmos_cubed_sphere.

Thanks,

Matt

@bensonr
Contributor

bensonr commented Jan 25, 2022

@MatthewPyle-NOAA - if you are working within the UFS, the dev/emc branch should be your target. We will synchronize and cherry-pick from branch to branch as needed.

@MatthewPyle-NOAA
Contributor Author

Thanks for the quick response, Rusty. I will pivot to dev/emc.
