Time-extrapolation of climatological/forcing data

- m.hadfield (NIWA)

Consider the following situation...
I want to run ROMS for 100 days, i.e. from t=0 to t=100 (days). My forcing or climatological data consists of 100 consecutive one-day averages, and the times at the centre of the averaging periods are 0.5,1.5,...,99.5. I start the model at t=0 and it immediately stops, complaining that the model time is less than the minimum time in the forcing or climatological data. A similar problem would occur at the other end of the run, when the time exceeds 99.5, if I could run it that far.
Now, there are many ways of working around this, but to save me the bother I would like the model to handle it, by using the t=0.5 data for the first half day of the model run and the t=99.5 data for the last half day of the model run. Between 0.5 and 99.5 I would like it to interpolate linearly, as it does now.
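To make this concrete, here is a toy sketch in plain Python/numpy (not ROMS code) of the behaviour I have in mind; np.interp already works this way, holding the end values outside the data range and interpolating linearly in between:

```python
import numpy as np

# Times at the centres of the 100 one-day averages: 0.5, 1.5, ..., 99.5
data_time = np.arange(100) + 0.5
# Stand-in values for a forcing or climatology field
data_vals = np.sin(2.0 * np.pi * data_time / 25.0)

# np.interp clamps to the first/last value outside [0.5, 99.5]
print(np.interp(0.0, data_time, data_vals))    # returns the t=0.5 value
print(np.interp(50.2, data_time, data_vals))   # linear interpolation, as ROMS does now
print(np.interp(100.0, data_time, data_vals))  # returns the t=99.5 value
```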
Do other people agree this would be a good thing? It should be easy to implement.
- arango (DMCS, Rutgers University)
Well, I see your point. However, this is extremely problematic because the logic is generic for all ROMS algorithms. If we do what you recommend, it will be fatal for all the adjoint-based algorithms when the basic state nonlinear solution is time-interpolated to linearize the adjoint model.
In your case, the 0.5-day interval is not problematic. However, if the interval is much larger, like a month or a season, it will be biased. I assume that the climatology is generated from another ROMS run, right?
I think that you have four options:
1) Add an additional record at t=-0.5 that is the same as the t=0.5 record.
2) Start ROMS at t=0.5.
3) Change the value of the first record of the time variable from 0.5 to 0.0 (see the sketch below).
4) Introduce a cpp option that hacks the code to do what you want.
I will have to think carefully about this. I can see cases where you want both behaviors in the same run. I suppose an attribute could be added to the NetCDF time variable to enable this behavior only when you want it.
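For instance, option 3 is just a one-line edit to the forcing time variable. Here is a rough sketch using the netCDF4 Python module (the file name frc.nc and the variable name sms_time are only examples; use whatever your file actually has):

```python
from netCDF4 import Dataset

# Open the forcing file for in-place editing and relabel the first record,
# which was at t=0.5 d, as t=0.0 d so it covers the start of the run.
nc = Dataset("frc.nc", "a")
nc.variables["sms_time"][0] = 0.0
nc.close()
```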
- m.hadfield (NIWA)
OK, Hernan, no need to bother.
What I typically do is duplicate the t=0.5 record at t=-0.5 using ncks. But I keep forgetting to do this, so I was looking for a solution that requires less effort on my part. However, if a solution in ROMS is going to complicate the code significantly, then best leave it as it is.
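For the record, the padding step is easy enough to script. Here is a rough Python/xarray equivalent of what I do with ncks (the file name frc.nc and the time coordinate sms_time are only examples; decode_times=False keeps the times as plain numbers in days):

```python
import xarray as xr

ds = xr.open_dataset("frc.nc", decode_times=False)

# Copy the first record (t=0.5 d) and shift its time back one day to t=-0.5 d
first = ds.isel(sms_time=slice(0, 1))
first = first.assign_coords(sms_time=first["sms_time"] - 1.0)

# Prepend the shifted copy; data_vars="minimal" leaves time-independent variables alone
padded = xr.concat([first, ds], dim="sms_time", data_vars="minimal")
padded.to_netcdf("frc_padded.nc")
```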
- m.hadfield (NIWA)
Re: Time-extrapolation of climatological/forcing data
m.hadfield wrote:
Consider the following situation...
I want to run ROMS for 100 days, i.e. from t=0 to t=100 (days). My forcing or climatological data consists of 100 consecutive one-day averages, and the times at the centre of the averaging periods are 0.5,1.5,...,99.5. I start the model at t=0 and it immediately stops, complaining that the model time is less than the minimum time in the forcing or climatological data. A similar problem would occur at the other end of the run, when the time exceeds 99.5, if I could run it that far.

This last assertion is incorrect. Although ROMS will issue an error message and stop if it finds the initial model time to be less than the time of the first record in a forcing file, it will continue after the last record in a forcing file, keeping data from that record indefinitely. (I imagine the same applies to climatology and boundary files, since they use the same logic in GET_CYCLE, but haven't actually checked.)
I'm sure everyone but me already knew that, but I thought I'd better document the situation correctly for posterity.
- kate

Re: Time-extrapolation of climatological/forcing data
Yes, I knew that, but if you go to restart that job at day 105 (or 99.7), it will then fail.
- m.hadfield (NIWA)
Re: Time-extrapolation of climatological/forcing data
m.hadfield wrote:
This last assertion is incorrect. Although ROMS will issue an error message and stop if it finds the initial model time to be less than the time of the first record in a forcing file, it will continue after the last record in a forcing file, keeping data from that record indefinitely.

Nope, that's not exactly what it does, either. Let's say we run the nonlinear model from t=0 to t=1.0 d with surface stress data at t=0 and t=0.5 d and no cycle_length attribute. At startup the model loads both records of stress data, and from t=0 to 0.5 d it interpolates linearly in time between them. Then, when the model time passes 0.5 d, it keeps the same two records of stress data but assigns a time of 1.0 d to the second one (see variable Tintrp in GET_2DFLD). It continues to interpolate linearly between the two records, but with new interpolation coefficients based on the new (incorrect) time for the second record.
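To put toy numbers on it (plain Python, not the ROMS code): suppose the stress is 0 at t=0 and 1 at t=0.5 d. Past the last record, simple persistence would give 1, but with the second record re-timed to 1.0 d the interpolation is pulled back towards the first record:

```python
# Two stress records: value 0.0 at t=0 d and 1.0 at t=0.5 d (toy numbers only)
t1, v1 = 0.0, 0.0
t2, v2 = 0.5, 1.0

t = 0.75                          # model time past the last record
print(v2)                         # 1.0: what persistence of the last record would give

# What the behaviour described above gives once the second record is re-timed to 1.0 d
t2_new = 1.0
w = (t - t1) / (t2_new - t1)      # interpolation weight for the second record
print((1.0 - w) * v1 + w * v2)    # 0.75: the field drifts back towards the t=0 record
```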
Clearly this is wrong, so given this behaviour and the issue mentioned by Kate, the moral is: don't run your model past the last record in your forcing file. Perhaps the model should check for this?