Possibly 3 nesting-related bugs

Bug reports, work arounds and fixes

Moderators: arango, robertson

cae
Posts: 36
Joined: Mon Jun 30, 2003 5:29 pm
Location: UC Santa Cruz

Possibly 3 nesting-related bugs

#1 Unread post by cae »

In ROMS revision 667, I found what may be 2 bugs in contact.m and one in inp_par.F. Apologies if these are correct as they are.

1) Lines 1608 and 1664 in contact.m:

C.mask_v   = G(rg).mask_u(IN);
should be

C.mask_v   = G(rg).mask_v(IN);
2) Lines 792 to 800 in contact.m:
R.lon_rho(IstrR+3:IendR+3,JstrR+3)=G(rg).lon_rho(IstrR:IendR,JstrR);
R.lon_rho(IstrR+3:IendR+3,JendR+3)=G(rg).x_rho(IstrR:IendR,JendR);
R.lon_rho(IstrR+3,JstrR+4:JendR+2)=G(rg).lon_rho(IstrR,JstrR+1:JendR-1);
R.lon_rho(IendR+3,JstrR+4:JendR+2)=G(rg).x_rho(IendR,JstrR+1:JendR-1);

R.lat_rho(IstrR+3:IendR+3,JstrR+3)=G(rg).y_rho(IstrR:IendR,JstrR);
R.lat_rho(IstrR+3:IendR+3,JendR+3)=G(rg).y_rho(IstrR:IendR,JendR);
R.lat_rho(IstrR+3,JstrR+4:JendR+2)=G(rg).lon_rho(IstrR,JstrR+1:JendR-1);
R.lat_rho(IendR+3,JstrR+4:JendR+2)=G(rg).y_rho(IendR,JstrR+1:JendR-1);
should be
R.lon_rho(IstrR+3:IendR+3,JstrR+3)=G(rg).lon_rho(IstrR:IendR,JstrR);
R.lon_rho(IstrR+3:IendR+3,JendR+3)=G(rg).lon_rho(IstrR:IendR,JendR);
R.lon_rho(IstrR+3,JstrR+4:JendR+2)=G(rg).lon_rho(IstrR,JstrR+1:JendR-1);
R.lon_rho(IendR+3,JstrR+4:JendR+2)=G(rg).lon_rho(IendR,JstrR+1:JendR-1);

R.lat_rho(IstrR+3:IendR+3,JstrR+3)=G(rg).lat_rho(IstrR:IendR,JstrR);
R.lat_rho(IstrR+3:IendR+3,JendR+3)=G(rg).lat_rho(IstrR:IendR,JendR);
R.lat_rho(IstrR+3,JstrR+4:JendR+2)=G(rg).lat_rho(IstrR,JstrR+1:JendR-1);
R.lat_rho(IendR+3,JstrR+4:JendR+2)=G(rg).lat_rho(IendR,JstrR+1:JendR-1);
3) Line 142 in inp_par.F:

      CALL set_contact (1, model)
might be

      DO ng=1,Ngrids
        CALL set_contact (ng, model)
      END DO
Thanks if you can take a look at them.

Chris

arango
Site Admin
Posts: 1367
Joined: Wed Feb 26, 2003 4:41 pm
Location: DMCS, Rutgers University
Contact:

Re: Possibly 3 nesting-related bugs

#2 Unread post by arango »

The corrections to contact.m are correct. Great catch! I missed those. There were also other spherical coordinates missing from the R-structure in function refine_coordinates. Please update.

However, the correction to inp_par.F is incorrect. The code is fine as it is. The call to routine set_contact should be done only once, with:

     CALL set_contact (1, model)
Thank you for reporting this problem in the contact.m Matlab script.

Re: Possibly 3 nesting-related bugs

#3 Unread post by cae »

Thanks, Hernan.

I was least sure of the change to inp_par.F. Thank you for clarifying.

Chris

Re: Possibly 3 nesting-related bugs

#4 Unread post by cae »

Hi again,

Thanks for correcting contact.m. There's still one spot that needs fixing:

Line 1695 in contact.m:

      C.mask_v   = G(rg).mask_u(IN);
should be

      C.mask_v   = G(rg).mask_v(IN);
Another issue that I'm seeing with a refinement nesting arrangement is that the free surface of the fine grid is not making its way back to the coarse grid. I've found this to be correctable with the following adjustment. As before, it may be wrong, and I'm sorry if the code is correct as is.

In nesting.F (revision 668), lines 2033-2061 look like:

!
!  Free-surface.
!
# ifdef SOLVE3D
            CALL fine2coarse2d (rg, dg, model, tile,                    &
     &                          r2dvar, RefineScale(dg),                &
     &                          cr, Rcontact(cr)%Npoints, Rcontact,     &
     &                          LBiD, UBiD, LBjD, UBjD,                 &
     &                          LBiR, UBiR, LBjR, UBjR,                 &
#  ifdef MASKING
     &                          GRID(dg)%rmask,                         &
     &                          GRID(rg)%rmask,                         &
#  endif
     &                          COUPLING(dg)%Zt_avg1,                   &
     &                          COUPLING(rg)%Zt_avg1)
# else
            CALL fine2coarse2d (rg, dg, model, tile,                    &
     &                          r2dvar, RefineScale(dg),                &
     &                          cr, Rcontact(cr)%Npoints, Rcontact,     &
     &                          LBiD, UBiD, LBjD, UBjD,                 &
     &                          LBiR, UBiR, LBjR, UBjR,                 &
#  ifdef MASKING
     &                          GRID(dg)%rmask,                         &
     &                          GRID(rg)%rmask,                         &
#  endif
     &                          OCEAN(dg)%zeta(:,:,knew(dg)),           &
     &                          OCEAN(rg)%zeta(:,:,1),                  &
     &                          OCEAN(rg)%zeta(:,:,2))
# endif
The #ifdef SOLVE3D portion (which is not mimicked in the immediately subsequent calls to fine2coarse2d for other variables) does not, I think, send the right information to fine2coarse2d: it is sending COUPLING, but should be sending OCEAN. I find that I can get zeta of the refined grid passed to the coarse grid if I remove the SOLVE3D portion:

!
!  Free-surface.
!
            CALL fine2coarse2d (rg, dg, model, tile,                    &
     &                          r2dvar, RefineScale(dg),                &
     &                          cr, Rcontact(cr)%Npoints, Rcontact,     &
     &                          LBiD, UBiD, LBjD, UBjD,                 &
     &                          LBiR, UBiR, LBjR, UBjR,                 &
#  ifdef MASKING
     &                          GRID(dg)%rmask,                         &
     &                          GRID(rg)%rmask,                         &
#  endif
     &                          OCEAN(dg)%zeta(:,:,knew(dg)),           &
     &                          OCEAN(rg)%zeta(:,:,1),                  &
     &                          OCEAN(rg)%zeta(:,:,2))
Thanks for taking a look!

Chris

Re: Possibly 3 nesting-related bugs

#5 Unread post by arango »

Yes, there were still a couple of wrong assignments for array C.mask_v in script contact.m. I corrected the script, thank you. Please update.

I need to look at a couple of the routines in nesting.F. I think that you are correct, thank you. During initialization, we initialize Zt_avg1 (the filtered free surface over all barotropic steps) instead of zeta, which requires a special IF-conditional in fine2coarse. However, I did change the initialization, and this is no longer needed because all the nesting processing is done after the calls to ini_fields. I need to track this in the debugger and verify that everything is correct.

I think that this also resolves the issues reported earlier by Mark Hadfield. If this change is done, the grid refinement in the DOGBONE test case runs to the end without blowing up.

It is going to take more realistic applications with nesting to fine-tune its algorithms. This is very complicated, and I appreciate everybody's help during this beta-testing period.

Re: Possibly 3 nesting-related bugs

#6 Unread post by cae »

Hi. Excellent. I'm glad if these are on the right track.

I'm sure you'll find this with the debugger, but there's similar logic (with the SOLVE3D CPP definition and Zt_avg1) at lines 2154-2164 and 2190-2203 of nesting.F (revision 668). Note also that an END IF on line 2163 should be an END DO.

My realistic application still doesn't run very long, but it's getting there...

Re: Possibly 3 nesting-related bugs

#7 Unread post by arango »

Make sure that you have values for the variable interpolate in the contact points NetCDF file:

        int interpolate(Ncontact) ;
                interpolate:long_name = "vertical interpolation at contact points logical switch" ;
                interpolate:flag_values = 0, 1 ;
                interpolate:flag_meanings = "false true" ;
....

 interpolate = 1, 1 ;
This is a tunable parameter that activates vertical interpolation at the contact points.

Re: Possibly 3 nesting-related bugs

#8 Unread post by cae »

Hi.

I have another section of code that may be in error. In mod_nesting.F (revision 668), lines 487-534 read:

!
!-----------------------------------------------------------------------
!  Deactivate boundary condition switches if contact point lay on the
!  physical nested grid boundary.
!-----------------------------------------------------------------------
!
      DO cr=1,Ncontact
        rg=receiver_grid(cr)
        IF (RefinedGrid(rg).and.(RefineScale(rg).gt.0) ) THEN
          LBC_apply(rg) % west  = .FALSE.      ! This is a refinement
          LBC_apply(rg) % south = .FALSE.      ! grid, so we need to
          LBC_apply(rg) % east  = .FALSE.      ! avoid applying lateral
          LBC_apply(rg) % north = .FALSE.      ! boundary conditions
        ELSE
          DO ibry=1,4
            Imin=BRY_CONTACT(ibry,cr) % Ibmin  ! Deactivate full or
            Imax=BRY_CONTACT(ibry,cr) % Ibmax  ! partial lateral
            Jmin=BRY_CONTACT(ibry,cr) % Jbmin  ! boundary conditions
            Jmax=BRY_CONTACT(ibry,cr) % Jbmax
            SELECT CASE (ibry)
              CASE (iwest)
                IF ((Jmin.ne.ispval).and.(Jmax.ne.ispval)) THEN
                  DO j=Jmin,Jmax
                    LBC_apply(rg) % west (j) = .FALSE.
                  END DO
                END IF
              CASE (isouth)
                IF ((Imin.ne.ispval).and.(Imax.ne.ispval)) THEN
                  DO i=Imin,Imax
                    LBC_apply(rg) % south(i) = .FALSE.
                  END DO
                END IF
              CASE (ieast)
                IF ((Jmin.ne.ispval).and.(Jmax.ne.ispval)) THEN
                  DO j=Jmin,Jmax
                    LBC_apply(rg) % east (j) = .FALSE.
                  END DO
                END IF
              CASE (inorth)
                IF ((Imin.ne.ispval).and.(Imax.ne.ispval)) THEN
                  DO i=Imin,Imax
                    LBC_apply(rg) % north(i) = .FALSE.
                  END DO
                END IF
            END SELECT
          END DO
        END IF
      END DO

The first part of this is fine: RefinedGrid(rg) will be true and RefineScale(rg) will be greater than 0 if the receiver grid is an inner nest. So, in that case, lateral boundary conditions should not be determined in the usual way, but through the nesting.

The issue I am seeing is that if RefineScale(rg) is 0, which will happen for the outermost grid, then the second part of the if-statement is executed, and the calculation of lateral boundary values for the outermost grid is turned off. For the refined-grid scenario, I don't think this is what you want. Rather, I think you don't want any change to LBC_apply at all in this case.

I was able to fix the problem I was having (in which lateral boundary values for the outermost grid were not changing from their initial values) by commenting out the entire ELSE section. I'm of course not sure if that's the right solution, or if it conflicts with the intentions of the composite-grid case, which I haven't thought about.

Re: Possibly 3 nesting-related bugs

#9 Unread post by arango »

Yes, I also noticed that today. I haven't fixed it yet; I was looking at another problem in the debugger. I rewrote the fine2coarse routines in a more generic and compact way. I will release all of this tomorrow, I think.

In mod_nesting.F, routine allocate_nesting, we need to have instead:

      DO cr=1,Ncontact
        rg=receiver_grid(cr)
        IF (RefinedGrid(rg)) THEN
          IF (RefineScale(rg).gt.0) THEN
            LBC_apply(rg) % west  = .FALSE.    ! This is a refinement
            LBC_apply(rg) % south = .FALSE.    ! grid, so we need to
            LBC_apply(rg) % east  = .FALSE.    ! avoid applying lateral
            LBC_apply(rg) % north = .FALSE.    ! boundary conditions
          END IF          
        ELSE
          ...
        END IF
      END DO
Also, I have been working on a Matlab script to plot all the nested solutions (concatenation and overlays). I need to create animations to see what is happening; it is difficult to follow the time evolution of a variable in the debugger.

I rechecked the issue about Zt_avg1 in the fine2coarse2d routine. In 3D applications, we do indeed need to update Zt_avg1, because this is the variable used outside of the 2D engine to compute other quantities like depths and thicknesses (see set_depth). The assignment to zeta(:,:,1:2) is done in set_zeta from Zt_avg1.
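The role of Zt_avg1 can be sketched in a few lines of Python (a conceptual illustration only; the function names and the fake fast-time dynamics are hypothetical stand-ins, not the actual ROMS code):

```python
import numpy as np

def barotropic_engine(zeta, nfast, rng):
    """Toy 2D engine: step the free surface nfast times and accumulate
    its fast-time average, the analogue of ROMS Zt_avg1."""
    zt_avg1 = np.zeros_like(zeta)
    for _ in range(nfast):
        zeta = zeta + 0.01 * rng.standard_normal(zeta.shape)  # fake dynamics
        zt_avg1 += zeta
    return zeta, zt_avg1 / nfast   # filtered over all barotropic steps

def set_zeta_like(zt_avg1):
    """Analogue of set_zeta: both time levels of zeta(:,:,1:2) are
    loaded from the filtered field."""
    return np.stack([zt_avg1, zt_avg1], axis=-1)

rng = np.random.default_rng(0)
zeta, zt_avg1 = barotropic_engine(np.zeros((4, 5)), nfast=10, rng=rng)
zeta12 = set_zeta_like(zt_avg1)   # what the 3D kernel sees next step
```

Since depths and thicknesses (set_depth) are computed from the filtered field, exchanging Zt_avg1 in the SOLVE3D branch keeps the donor and receiver kernels consistent.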

Thank you for your help. It is important to know how this works when setting up complex configurations. We are all gaining experience. We put a lot of thought into this nesting design; we just need to be sure that everything is working as it is supposed to and that we don't have bugs...

m.hadfield
Posts: 521
Joined: Tue Jul 01, 2003 4:12 am
Location: NIWA

Re: Possibly 3 nesting-related bugs

#10 Unread post by m.hadfield »

Thanks for all your effort on this, Hernan. It's impressive looking stuff, but quite complicated, obviously.

Re: Possibly 3 nesting-related bugs

#11 Unread post by cae »

Hi.

Thanks so much, Hernan, for developing this nesting code and continuing to fix it. As you say, it's very complicated, but it will be a great capability for everyone once it's working. We all appreciate it.

I've found what may be another bug. In nesting.F (revision 671), lines 221-227 read:

!
!  Update coarse grid depth variables. We have a new coarse grid
!  adjusted free-surface, Zt_avg1.
!
          DO tile=first_tile(ng),last_tile(ng),+1
            CALL set_depth (ng, tile)
          END DO
I think that ng should be replaced by ngc in all three locations here. ngc is the donor grid for this refinement grid, and I think that is what is intended, based on the included comment. If ng is used, then (I think) the wrong depth levels are used in the subsequent lines that transfer the 3D fields from the fine to the coarse grid.

As always, I apologize if this is not an issue or if it's a red herring. But with this change, I'm able to run beyond a handful of time-steps without grid-point noise developing in the contact region (though other nesting problems still develop later :) ).

Re: Possibly 3 nesting-related bugs

#12 Unread post by arango »

Yes, thank you, Chris. Good catch. This is one of those copy-and-paste errors. We are currently solving for the refinement grid: its thickness and depth arrays are computed after we solve the vertically integrated equations (solve2d), but not for the coarser grid, since we are done with it for that time-step.

I rewrote the 2D momentum boundary forcing for the refinement grid. I am still testing it. I discovered a one-point shift in the contact points NetCDF file (the on_boundary variable), and I need to find a solution in the Matlab script contact.m. It is a matter of what the intrinsic function inpolygon returns: if the refinement point lies on the right (top) edge of the cell, we get the wrong coarse-grid forcing point:

                  1 ______ p______ 4
                    |             |
                    |             |
                    |             p
                    |             |
                    |_____________|
                  2                3
So I need to make a special rule to select the correct grid point to force the boundary. I missed this one...
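The tie-breaking problem can be illustrated with a minimal half-open point-in-cell test (a hypothetical Python sketch, not the MATLAB inpolygon implementation): a refinement point lying exactly on the shared edge of two coarse cells is owned by one or the other depending purely on the edge convention.

```python
def in_cell(x, y, x0, y0, x1, y1):
    """Half-open test for an axis-aligned cell: left/bottom edges count
    as inside, right/top edges do not. The choice of convention decides
    which coarse cell claims a point on a shared edge."""
    return (x0 <= x < x1) and (y0 <= y < y1)

# Point on the shared edge of cell A = [0,1]x[0,1] and B = [1,2]x[0,1]:
p = (1.0, 0.5)
owner_A = in_cell(*p, 0.0, 0.0, 1.0, 1.0)   # False with this convention
owner_B = in_cell(*p, 1.0, 0.0, 2.0, 1.0)   # True with this convention
```

With closed edges on both sides, the point would be claimed by both cells; whichever convention the point-in-polygon test uses determines which coarse-grid forcing point is returned, hence the need for a special rule.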

luzgarcia
Posts: 3
Joined: Tue Oct 26, 2010 1:37 am
Location: Instituto Español de Oceanografía

Re: Possibly 3 nesting-related bugs

#13 Unread post by luzgarcia »

Hi,

I am Luz García, working in the Ocean Modelling group of the Spanish Institute of Oceanography in A Coruña, Spain. We have been running a realistic configuration of ROMS for NW Iberia for several years using the AGRIF one-way nesting capabilities. In the last weeks, we have been trying to adapt our nested (refined!) configuration to run with the latest ROMS Rutgers version (v. 3.7, downloaded on 2014/03/18). We have experienced several problems in setting up the realistic configuration and ensuring volume conservation in the refined grid. We would like to share some of our findings on possible bugs, as well as some doubts, hoping to get some feedback.

First of all, the CPP options related to nesting we are using are:

#define NESTING
#define ONE_WAY
#undef TIME_INTERP_FLUX
#undef NO_CORRECT_TRACER
And these are our comments/questions:

1) Subroutine nesting.F (revision 719)

1.1) After reading this post and Hernan's answer, I was convinced that there was a bug in the calculation of zeta, but then I got confused after reading the comments on Zt_avg1 here. Is the SOLVE3D portion of the code correct, or should we comment it out?

1.2) I think there is a bug in line 2318. We have

 CALL exchange_v2d_tile (rg, tile,                         &
     &                                LBiR, UBiR, LBjR, UBjR,           &
     &                                OCEAN(rg)%ubar(:,:,k))
but we should have

 CALL exchange_v2d_tile (rg, tile,                         &
     &                                LBiR, UBiR, LBjR, UBjR,           &
     &                                OCEAN(rg)%vbar(:,:,k))
1.3) Possible bug in line 2365. We have

CALL mp_exchange3d (rg, tile, model, 2,                       &
     &                        LBiR, UBiR, LBjR, UBjR, 1, N(rg),         &
     &                        NghostPoints,                             &
     &                        EWperiodic(rg), NSperiodic(rg),           &
     &                        OCEAN(rg)%u(:,:,:,nstp(rg)),              &
     &                        OCEAN(rg)%u(:,:,:,nstp(rg)))
but we should have

CALL mp_exchange3d (rg, tile, model, 2,                       &
     &                        LBiR, UBiR, LBjR, UBjR, 1, N(rg),         &
     &                        NghostPoints,                             &
     &                        EWperiodic(rg), NSperiodic(rg),           &
     &                        OCEAN(rg)%u(:,:,:,nstp(rg)),              &
     &                        OCEAN(rg)%v(:,:,:,nstp(rg)))
2) For a total refinement nesting (all OBs of the fine grid are inside the coarse grid), we would not expect a climatology to be needed for the finer grid, since all the necessary information should be provided at the boundaries by the solution on the coarser grid (nesting option Nest for the OBC of the fine grid). However, we get a configuration error for not providing a climatology at the resolution of the finer grid.
We think that this problem is solved by just adding the following conditional

IF (.not.(RefinedGrid(ng).and.RefineScale(ng).gt.0)) THEN
at around line 862 in get_data.F, where the NetCDF climatology file is read, and including an END IF at line 947.

A similar procedure must be done in subroutine set_data.F by adding

IF (.not.(RefinedGrid(ng).and.RefineScale(ng).gt.0)) THEN
at line 1132 and an END IF at line 1219.

We are not sure if this is the best way to avoid the problem, or if it reflects that something is wrong in our configuration, but with these corrections the simulation at least runs.

We are still struggling to get it to run, especially with volume conservation in the fine grid. Any hints or ideas would be much appreciated.

Thanks a lot,

Luz García

Re: Possibly 3 nesting-related bugs

#14 Unread post by arango »

Yes, the calls to the exchange routines exchange_v2d_tile and mp_exchange3d need to be corrected as you mentioned above. However, your corrections to get_data and set_data are incorrect! We can still use climatology fields in refined grids for other purposes. Notice that the processing of climatology fields is controlled by logical switches read from the standard input file for each nested grid. For example, if we just need temperature and salinity in the coarser grid, we just set:

! Logical switches (TRUE/FALSE) to read and process climatology fields.
! See glossary below for details.

     LsshCLM == F F                        ! sea-surface height
      Lm2CLM == F F                        ! 2D momentum
      Lm3CLM == F F                        ! 3D momentum

  LtracerCLM == T T   \                    ! temperature, salinity, inert, grid 1
                F F                        ! temperature, salinity, inert, grid 2
So you can control all the processing of climatology fields in all nested grids. Your fix would actually introduce a bug!

With respect to volume conservation, you need to undefine the CPP option ONE_WAY. This was a surprise to us too at the beginning. The nesting capability of this version of ROMS is unique: it is not formulated as a boundary problem but as a horizontal flux operator problem. If you use two-way nesting, the volume conservation problem will go away.

Re: Possibly 3 nesting-related bugs

#15 Unread post by luzgarcia »

Thank you for your reply; it's all clear with get_data and set_data.

We have been playing a little more with the matter of volume conservation. I include a figure here showing the SSH of our model results at one grid position to illustrate the comments below:

1) With TWO_WAY applications there is no problem with volume conservation (see the green line in the figure). In the end, the SOLVE3D block should be left uncommented (see 1.1 in my previous post), just as it is in the trunk version.
2) But we wanted ONE_WAY to work properly, and we checked that:
2.1) We already knew that with the code as it is we get no volume conservation (blue line in the figure).
2.2) Taking into account that some corrections are needed to ensure volume conservation (see the one-way nesting descriptions for ROMS-AGRIF, http://www.brest.ird.fr/personnel/ppenv ... od2006.pdf, and ROMS-UCLA, http://www.sciencedirect.com/science/ar ... 031000082X), we decided to activate the VolCons option only for the refined grid. The results correspond to the red line in the plot. There is a clear inconsistency, which seems to result from the fact that the obc_volcons subroutines are called in the barotropic steps (step2d.F) whereas nesting is performed in the baroclinic steps (main3d.F).
2.3) We think that the correction to ensure volume conservation should be done in nesting.F, so we adapted this subroutine to apply the global barotropic correction that is implemented in obc_volcons (with the VolCons option in ocean.in not activated this time). I attach our adapted nesting.F subroutine, which is an attempt to impose volume conservation on the 2D velocities. Although we get volume conservation with this correction (ONE_WAY nesting), we are not able to reproduce the amplitude of the tidal signal (see the pink line in the figure). We are aware that in our draft subroutine no volume conservation constraint is applied to the 3D fluxes.
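The kind of global barotropic correction described in 2.3 can be sketched as follows (a simplified NumPy sketch with idealized two-boundary geometry; the function name is hypothetical, and this is not the actual obc_volcons code):

```python
import numpy as np

def enforce_volume_conservation(ubar_w, ubar_e, h_w, h_e, dy):
    """Remove the net volume flux through two open boundaries by
    subtracting one uniform, area-weighted velocity correction,
    in the spirit of ROMS obc_volcons."""
    # Positive ubar enters through the west face and leaves through east.
    net_flux = np.sum(ubar_w * h_w * dy) - np.sum(ubar_e * h_e * dy)
    area = np.sum(h_w * dy) + np.sum(h_e * dy)   # total open-boundary area
    ubar_corr = net_flux / area                  # uniform correction (m/s)
    return ubar_w - ubar_corr, ubar_e + ubar_corr

dy = 1000.0                          # boundary cell width (m)
h_w = np.full(10, 50.0)              # depths along the west boundary (m)
h_e = np.full(10, 50.0)              # depths along the east boundary (m)
ubar_w = np.full(10, 0.10)           # inflow stronger than...
ubar_e = np.full(10, 0.05)           # ...outflow: net volume gain
uw, ue = enforce_volume_conservation(ubar_w, ubar_e, h_w, h_e, dy)
net = np.sum(uw * h_w * dy) - np.sum(ue * h_e * dy)   # zero after correction
```

Applied uniformly, the correction cancels the net boundary flux while leaving the spatial structure of the boundary flow unchanged.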

Does anybody have any ideas that could help us fix this?

Thank you very much!!

Luz García
Attachments
nesting.F
SSH_refinedgrid_nesting.jpg

Re: Possibly 3 nesting-related bugs

#16 Unread post by arango »

Of course, this makes a lot of sense to me. We cannot apply volume conservation in a regional, tidally forced application! Tides do not conserve volume in such applications; if you impose volume conservation, you will alter the tidal signal. This explains your pink curve very well. So now you know what happens when you impose volume conservation in a tidally forced application.

I read the AGRIF papers that you mentioned a long time ago. If I recall correctly, none of those applications were forced with tides. I need to think about the changes that you made to nesting.F; I don't see anything wrong with them for non-tidal one-way solutions. I bet that if you remove the tidal forcing from your application, ONE_WAY nesting will work fine with your volume conservation adjustment.

If you need tides in your nesting applications, you need two-way interaction between the grids.

Recall that the nesting methodology in AGRIF is formulated purely in terms of boundary conditions. Our methodology is quite different: we never apply boundary conditions in refinement applications. Instead, we increase the contact points between grids and evaluate the full horizontal operator at the physical boundary of the refined grid. The contact points are spatially and temporally interpolated from the coarser grid, similar to what we do with periodic boundary conditions. This methodology was harder to implement because we needed to change the i- and j-loops of the entire numerical kernel; it took a long time to achieve and test.

For more information about contact points, check WikiROMS.

rduran
Posts: 152
Joined: Fri Jan 08, 2010 7:22 pm
Location: Theiss Research

Re: Possibly 3 nesting-related bugs

#17 Unread post by rduran »

I was looking at the WikiROMS page that Hernan posted and found that the donor/receiver grids exchange information through a bilinear interpolation:

The horizontal interpolation weights are as follows: 
Hweight(1,:) = (1.0 - p) * (1.0 - q) 
Hweight(2,:) = p * (1.0 - q) 
Hweight(3,:) = p * q 
Hweight(4,:) = (1.0 - p) * q 
It's straightforward to show that the spatial derivatives of a field that has been bilinearly interpolated from a coarser to a finer grid are discontinuous. Now imagine a flow in geostrophic balance (to first order, if you like; this includes quite a few of the flows simulated with ROMS). The geostrophic velocity is, give or take a constant, equal to the orthogonal pressure gradient, which will be discontinuous across donor-grid cell boundaries (except possibly at one point on each cell boundary, but that is unlikely). This means that the velocity is discontinuous at each donor-grid cell boundary.
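The discontinuity is easy to verify numerically. Below is a minimal sketch (assumed Python/NumPy, with the quoted Hweight formulas transcribed directly; p and q are the fractional offsets inside a donor cell): interpolating the smooth field f = x² and taking one-sided derivatives across a donor-cell edge gives two different slopes.

```python
import numpy as np

def bilinear(f, x, y):
    """Bilinear interpolation of f (sampled at integer donor nodes)
    using the ROMS Hweight formulas."""
    i, j = int(np.floor(x)), int(np.floor(y))
    p, q = x - i, y - j
    w1 = (1 - p) * (1 - q)          # Hweight(1,:)
    w2 = p * (1 - q)                # Hweight(2,:)
    w3 = p * q                      # Hweight(3,:)
    w4 = (1 - p) * q                # Hweight(4,:)
    return (w1 * f[i, j] + w2 * f[i + 1, j] +
            w3 * f[i + 1, j + 1] + w4 * f[i, j + 1])

f = np.fromfunction(lambda i, j: i**2, (5, 5))   # smooth donor field x**2
eps, y = 1e-6, 0.5
# One-sided x-derivatives on either side of the donor-cell edge x = 2:
left  = (bilinear(f, 2.0, y) - bilinear(f, 2.0 - eps, y)) / eps   # slope 3
right = (bilinear(f, 2.0 + eps, y) - bilinear(f, 2.0, y)) / eps   # slope 5
```

The interpolant is piecewise linear within each donor cell, so its slope jumps at every donor-cell edge (here from 3 to 5); a C1 scheme such as bicubic interpolation would remove the jump at extra cost.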

Potential implications are the creation of perturbations that will travel as waves, and velocity perturbations that will affect computed trajectories for those who compute them.

I wouldn't be surprised if there are more consequences. Unfortunately, I am currently unable to look into this more carefully, but I nonetheless thought I would bring it to your attention.

Wouldn't it be better to use an interpolation scheme with (at least) continuous first derivatives?

Correction: the discontinuity is along the borders of the square with corners at (Idg,Jdg), (Idg+1,Jdg), (Idg,Jdg+1), (Idg+1,Jdg+1) in the figure you can find here.
