Possible bug in wetdry.F for nested apps

Bug reports, work arounds and fixes

Moderators: arango, robertson

stef
Posts: 192
Joined: Tue Mar 13, 2007 6:38 pm
Location: Independent researcher
Contact:

Possible bug in wetdry.F for nested apps

#1 Unread post by stef »

Hi,

I was reading wetdry.F and wondered whether the index bounds should account for the contact regions in nested applications. Currently there are ranges like "JstrR,JendR" etc., but shouldn't they be "JstrT,JendT" instead?
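To make the idea concrete, here is a minimal sketch of the kind of change I mean (a schematic loop only, not the actual wetdry.F code; the loop body is just a placeholder):

Code:

      ! Current style: the bounds cover the interior plus the physical
      ! boundary points only.
      DO j=JstrR,JendR
        DO i=IstrR,IendR
          rmask_wet(i,j)=1.0_r8
        END DO
      END DO
      ! With the "full" ranges the same loops would also cover the
      ! nesting contact regions of refined/composite grids.
      DO j=JstrT,JendT
        DO i=IstrT,IendT
          rmask_wet(i,j)=1.0_r8
        END DO
      END DO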

Thanks!

Regards, Stefan

arango
Site Admin
Posts: 1367
Joined: Wed Feb 26, 2003 4:41 pm
Location: DMCS, Rutgers University
Contact:

Re: Possible bug in wetdry.F for nested apps

#2 Unread post by arango »

Hmm. I will have to think about it, but maybe it will be needed. It will need to be tested. There is a set of those indices for RHO-, PSI-, U-, and V-points. It will only be relevant if coastlines are present in the contact points.

jcwarner
Posts: 1200
Joined: Wed Dec 31, 2003 6:16 pm
Location: USGS, USA

Re: Possible bug in wetdry.F for nested apps

#3 Unread post by jcwarner »

We made some updates to those exchanges as well, but they had to do with the masking that was used. Either Dave or I will send you the changes in a bit. We have some applications that we are working on first to test this out.
-j

stef
Posts: 192
Joined: Tue Mar 13, 2007 6:38 pm
Location: Independent researcher
Contact:

Re: Possible bug in wetdry.F for nested apps

#4 Unread post by stef »

Thanks for your replies!

I am working on a 2D application, and another potentially relevant thing I just noticed is that the nesting boundary conditions for the refined grid use fields such as

Code:

REFINED(cr)%DU_avg2(1,m,tnew)
I think this is the case even in 2D applications. However, as far as I can tell, this field is never calculated in nesting.F for 2D applications. Could this be true? I only checked the Rutgers code; maybe you have already changed this in COAWST?

I haven't tested it, but it could explain why I always get zero influx of ubar/vbar at my refined (open) boundaries. Looking into this was actually how I stumbled across the wet-mask issue: I printed GRID(ng)%umask_wet(i,j) in subroutine put_refine2d(), and it was always zero. After modifying the index ranges in wetdry.F, umask_wet was "2" for wet-wet neighbours, as it is supposed to be (disregarding the fact that only binary masks should perhaps appear in that subroutine, but that's a different problem and I think I can work around it).

However, when I checked the history file, the flux at the boundary was *still* zero. The only trace I have found so far is the boundary condition REFINED(cr)%DU_avg2(1,m,tnew).

Let me know if this makes any sense.

Regards, Stefan

arango
Site Admin
Posts: 1367
Joined: Wed Feb 26, 2003 4:41 pm
Location: DMCS, Rutgers University
Contact:

Re: Possible bug in wetdry.F for nested apps

#5 Unread post by arango »

Notice that the indices told and tnew are local (dummy) time indices:

Code:

!
!  Set time snapshot indices for the donor grid data.
!
      told=3-RollingIndex(cr)
      tnew=RollingIndex(cr)
However, if you are running a 2D shallow-water application (SOLVE3D is off), why are you dealing with state variables (DU_avg1, DU_avg2, DV_avg1, DV_avg2, Zt_avg1) that are only needed in 3D applications? Something is very wrong here. I have never run nesting in a 2D application. If this is the case, some SOLVE3D conditionals are missing in nesting.F.

stef
Posts: 192
Joined: Tue Mar 13, 2007 6:38 pm
Location: Independent researcher
Contact:

Re: Possible bug in wetdry.F for nested apps

#6 Unread post by stef »

I see, thanks. That DU_avg field actually appears in u2dbc_im.F, for the "nested" type boundary conditions (in a 2D application). I substituted a dummy value for it there as a test, and the dummy value then appears in my history file. Unfortunately I can only continue testing on Monday. Anyway, thanks for looking into it!

arango
Site Admin
Posts: 1367
Joined: Wed Feb 26, 2003 4:41 pm
Location: DMCS, Rutgers University
Contact:

Re: Possible bug in wetdry.F for nested apps

#7 Unread post by arango »

Well, u2dbc_im.F and v2dbc_im.F need to be modified to include code relevant to 2D shallow-water nesting applications. I guess that we never accounted for a User needing nesting in a 2D application. It will need an equation to impose the transport from the donor coarser grid. We will have to figure out what value to load into bry_val.
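Off the top of my head, it would presumably be something of the form transport divided by the local face area (a schematic guess only; DU_donor is just a placeholder for whatever donor-grid transport ends up being stored):

Code:

      ! Recover a boundary ubar from the donor face transport (m3/s)
      ! by dividing by the local water depth times the face width.
      bry_val=DU_donor/(0.5_r8*(Drhs(Istr-1,j)+Drhs(Istr,j))*           &
     &                  on_u(Istr,j))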

stef
Posts: 192
Joined: Tue Mar 13, 2007 6:38 pm
Location: Independent researcher
Contact:

Re: Possible bug in wetdry.F for nested apps

#8 Unread post by stef »

arango wrote: "I guess that we never accounted for a User needing nesting in a 2D application."
The motivation would be to do storm surge modelling in a way that is roughly consistent with other storm surge models such as SLOSH or ADCIRC (both mostly 2D, at least until a couple of years ago... I'm not sure about the newest developments). 3D is great, but having 2D for comparison would be very interesting. Please let me know if you disagree.
arango wrote: "I guess that we never accounted for a User needing nesting in a 2D application. It will need an equation to impose the transport from the donor coarser grid. We will have to figure out what value to load into bry_val."
In the coming weeks I will try to better understand the algorithm; the hints you gave are very helpful. I read the Wiki today and think that DU/DV_avg2 are the averages of the barotropic flux over the entire baroclinic time step, weighted with the secondary weights (centered roughly in between the baroclinic steps). In refinement nesting, these are imposed at the finer grid's physical boundaries so that the mass fluxes of donor and receiver agree. I think that agrees with how I read the Nesting wiki when it says:
"In grid refinement applications, the information is exchanged at the beginning of the full time-step (top of main2d or main3d). Contrarily, in composite grid applications, the information is exchanged between each sub-time step call in main2d or main3d. That is, the composite donor and receiver grids need to sub-time step the 2D momentum equations before any of them start solving and coupling the 3D momentum equations. Therefore, the governing equations are solved and nested in a synchronous fashion."
So maybe in a 2D setup, one could store the volume flux of the donor after each barotropic corrector step, and impose that during the 3 (or 5, etc.) time steps made in the receiver grid?
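For reference, this is how I picture the accumulation, schematically (recalled from step2d, so the details may be off):

Code:

      ! Instantaneous barotropic volume flux through a U-face.
      DUon(i,j)=0.5_r8*(Drhs(i,j)+Drhs(i-1,j))*ubar(i,j,krhs)*on_u(i,j)
      ! Accumulated over the fast (barotropic) steps with the secondary
      ! weight cff2 to give the time-averaged flux used for nesting.
      DU_avg2(i,j)=DU_avg2(i,j)+cff2*DUon(i,j)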

stef
Posts: 192
Joined: Tue Mar 13, 2007 6:38 pm
Location: Independent researcher
Contact:

Re: Possible bug in wetdry.F for nested apps

#9 Unread post by stef »

Quoting my previous post: "So maybe in a 2D setup, one could store the volume flux of the donor after each barotropic corrector step, and impose that during the 3 (or 5, etc.) time steps made in the receiver grid?"
Hmm, I guess one should store two time levels of the donor volume flux along the boundary and then do a temporal interpolation. Applying only a single time level may be inconsistent with the temporal interpolation of zeta and ubar/vbar in put_refine()? Does this make any sense physically?

So one could store two consecutive DUon/DVon snapshots at the end of step2d.F for the coarse grid, keep them in the REFINED structure via get_persisted2(), and interpolate them in time in u/v2dbc_im.F. Then, in put_refine(), one would have to prevent overwriting ubar/vbar at the physical fine-grid boundary (similar to what is done for the 3D case).
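A minimal sketch of the interpolation step I have in mind (DU_old, DU_new, and DU_bry are placeholder names for the two stored donor snapshots and the interpolated boundary value, not existing ROMS arrays):

Code:

      ! Linear blend of two donor snapshots bracketing the receiver's
      ! current time, evaluated once per receiver time step.
      Wnew=(time(ng)-t_old)/(t_new-t_old)
      Wold=1.0_r8-Wnew
      DO m=1,Nbpoints
        DU_bry(m)=Wold*DU_old(m)+Wnew*DU_new(m)
      END DO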

stef
Posts: 192
Joined: Tue Mar 13, 2007 6:38 pm
Location: Independent researcher
Contact:

Re: Possible bug in wetdry.F for nested apps

#10 Unread post by stef »

I wrote a blog post [1] about my experience using ROMS in 2D setups with grid nesting and wetting/drying. It is at a very early stage, but if anybody is interested I would appreciate comments/suggestions, in particular if you are working on new code that will be pushed to the repository in the near future.

Currently I'm doing the following:

1) Updated array bounds in the file wetdry.F to include any contact regions.

2) New masks for ubar and vbar. I simply store the coefficient cff7 used to mask ubar/vbar in step2d_LF_AM3.h. The new masks are passed as arguments to the function fine2coarse2d() instead of the previous umask_full/vmask_full. The new masks also serve for the computation of the interpolation weights used in populating the child grid with parent grid information.

3) Don't apply the fine-to-coarse interpolation onto the coarse grid where the coarse grid is dry (see the sketch after this list).

4) The child grids are forced at the boundary with the latest instantaneous barotropic volume flux of the parent grid (the first option from my previous post).
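Regarding item 3, a schematic of what I mean (placeholder names; this is not the literal patch):

Code:

      ! Skip the fine-to-coarse replacement wherever the receiving
      ! coarse cell is dry, so dry parent cells are left untouched.
      DO j=Jstr,Jend
        DO i=Istr,Iend
          IF (rmask_wet(i,j).gt.0.5_r8) THEN
            zeta(i,j,knew)=zeta_from_fine(i,j)
          END IF
        END DO
      END DO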

As a test I'm trying to simulate tidal flow in the East River, NY, by scaling down from a large grid (12 km)

[figure: the large 12 km grid]

to a small grid of the East River (32 m; the Hudson/Harlem rivers are cropped for preliminary testing)

[figure: the 32 m East River grid]

It was important for me to use realistic (although severely cropped) bathymetry, so I get to know the wetting/drying algorithm better.

More details on researchgate:

[1] https://www.researchgate.net/publicatio ... tingdrying

jcwarner
Posts: 1200
Joined: Wed Dec 31, 2003 6:16 pm
Location: USGS, USA

Re: Possible bug in wetdry.F for nested apps

#11 Unread post by jcwarner »

This looks cool! I looked at your paper:
https://sriha.net/articles/storm-surge- ... ing-drying
and you are really digging into the fine details. I don't think many people have cascaded down from km to m scales. Nice to see.

arango
Site Admin
Posts: 1367
Joined: Wed Feb 26, 2003 4:41 pm
Location: DMCS, Rutgers University
Contact:

Re: Possible bug in wetdry.F for nested apps

#12 Unread post by arango »

Great work, Stefan. I concur with John about your telescoping approach into meter scales. It will take me some time to digest the changes that you made to the code. If you provide us with the modifications that you had to make for shallow-water nested applications to work, we can incorporate them into the code. Wetting and drying is problematic since we don't have a way to implement it from first principles in the governing equations. Therefore, we need to follow an engineering approach to implement it numerically in ROMS. I know that John and colleagues are working on a more robust approach to wetting and drying.

stef
Posts: 192
Joined: Tue Mar 13, 2007 6:38 pm
Location: Independent researcher
Contact:

Re: Possible bug in wetdry.F for nested apps

#13 Unread post by stef »

Hi,

Thanks for your email; I'll respond tomorrow. Meanwhile, I have attached the patch. Please note:

*) Some (much?) of this may be incorrect. I'm a beginner. My approach was "if I get a reasonable result, it's good enough for now".
*) I have not really understood MPI yet; in particular, at the end of step2d I'm doing things I'm not sure are right. I just copied some patterns I saw in your code.
*) I did not fix the possible "bug" reported here:
viewtopic.php?f=14&t=5438&p=21072#p21072
I did fix it in serial mode, but then realized that what I did didn't work in parallel mode. If I remember correctly, the coarse-grid boundary data was not available from all the tiles needed in the fine grid; there was always a little segment that wasn't properly processed. Instead I modified the fine grid to be identical to the coarse-grid topography at the boundary (visible in the figures above), which is perhaps a prudent thing to do anyway.
*) The patch is with respect to git commit 'c23f66eb118e', or "src:ticket:826".
mypatch.txt
(58.65 KiB) Downloaded 449 times

stef
Posts: 192
Joined: Tue Mar 13, 2007 6:38 pm
Location: Independent researcher
Contact:

Re: Possible bug in wetdry.F for nested apps

#14 Unread post by stef »

I should say that the "mod_stef.F" is very messy. Sorry. :roll: The only variables needed are the ones used in step2d.

stef
Posts: 192
Joined: Tue Mar 13, 2007 6:38 pm
Location: Independent researcher
Contact:

Re: Possible bug in wetdry.F for nested apps

#15 Unread post by stef »

Update: note that there is a problem with the patch in that the wet/dry flux limiter (i.e., suppressing outflow from a dry cell) does not work at the boundary. An already dry cell adjacent to the boundary may still export water out of the domain, depending on the boundary condition. It could be fixed simply by applying the fine-grid wet mask, but then the volume fluxes between the coarse and fine grids no longer agree.
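For anyone reading along, the limiter I am referring to works conceptually like this (a schematic only, not the actual ROMS code):

Code:

      ! A U-face flux is allowed only if the cell it drains is wet:
      ! positive DUon drains cell (i-1,j), negative DUon drains (i,j).
      IF (DUon(i,j).gt.0.0_r8) THEN
        IF (rmask_wet(i-1,j).lt.0.5_r8) DUon(i,j)=0.0_r8
      ELSE
        IF (rmask_wet(i  ,j).lt.0.5_r8) DUon(i,j)=0.0_r8
      END IF

At the boundary the imposed condition can override this check, which, as far as I can tell, is why an already dry boundary cell can still export water.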
