blow up problem


nageswararao

blow up problem

#1 Post by nageswararao »

Dear all,
Can anyone tell me what is causing my model to blow up? I am enclosing the last lines of the output log below. Please tell me how to solve this problem.


0: 99355 1394.930556 1.185092E+00 2.063833E+04 2.063952E+04 1.316904E+17 0
0: 99356 1394.944444 1.221343E+00 2.063845E+04 2.063968E+04 1.316910E+17 0
0: 99357 1394.958333 1.261493E+00 2.063857E+04 2.063983E+04 1.316917E+17 0
0: 99358 1394.972222 1.301413E+00 2.063867E+04 2.063997E+04 1.316925E+17 0
0: 99359 1394.986111 1.337026E+00 2.063876E+04 2.064009E+04 1.316932E+17 0
0: 99360 1395.000000 1.364751E+00 2.063883E+04 2.064019E+04 1.316940E+17 0
0: DEF_HIS - creating history file: roms_indian_ocean_his4y1_0047.nc
0: WRT_HIS - wrote history fields (Index=1,1) into time record = 0000001
0:
0: WRT_AVG - error while writing variable: scrum_time
0: into averages NetCDF file for time record: 15
0:
0: ROMS/TOMS - Output error ............ exit_flag: 3
0:
0:
0: Elapsed CPU time (seconds):
0:

Any help is appreciated.

Thanks in advance.

With regards & Happy New Year,
G.NageswaraRao.

kate
Posts: 4091
Joined: Wed Jul 02, 2003 5:29 pm
Location: CFOS/UAF, USA

Re: blow up problem

#2 Post by kate »

nageswararao wrote: 0: WRT_AVG - error while writing variable: scrum_time
0: into averages NetCDF file for time record: 15
The most likely problem from what you show is that it won't let you create a file larger than 2 GB. How big is that averages file? It looks like you told the history writing to create a new file here - perhaps you should do that for the averages as well. It is also possible to recompile the NetCDF library in such a way as to allow writing larger files, but you still want to break them up at some point for long runs.
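
For reference, in the Rutgers ROMS input script (ocean.in) the NDEFHIS and NDEFAVG parameters set how many time steps pass before a new history or averages file is created. A minimal sketch (the interval is only illustrative, and keyword names may differ in other ROMS branches):

    NDEFHIS == 8640    ! time steps between creating new history files
    NDEFAVG == 8640    ! time steps between creating new averages files

Choose an interval that keeps each output file comfortably under the 2 GB limit.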

nageswararao

#3 Post by nageswararao »

Hi Kate,
Thank you for your reply. You are right. The file size is 874774016 bytes (about 834 MB). How can I increase the file size limit in the NetCDF libraries that you mentioned? What you said is right, but I am unable to append those files if I write to different files, since the model writes the output to each NetCDF file at the same time index: for the first file it writes at l=1, for the second file it also writes the output at l=1, and so on. It does not advance the value of l when writing to different files. How can I get around this problem? Please do reply.

Thanks in advance.

kate
Posts: 4091
Joined: Wed Jul 02, 2003 5:29 pm
Location: CFOS/UAF, USA

#4 Post by kate »

The easiest way to increase the largest file size is to download and compile the 3.6.2 beta of the NetCDF library. However, that doesn't look like your problem, since your file is less than a gigabyte. Could you be running into quota problems or a full filesystem?
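
(A rough sketch of the usual build steps, assuming a standard Unix toolchain; the install prefix is only an example:

    ./configure --prefix=/usr/local/netcdf-3.6.2
    make
    make check
    make install

The 3.6.x series adds the 64-bit offset file format, which lifts the 2 GB limit on individual files.)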

When you say l=1, do you mean it is writing to the first record? The value I check is ocean_time, to make sure it has the expected values. What exactly do you think is wrong?
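
For instance, with the standard netCDF command-line tools (the file name is only an example):

    ncdump -v ocean_time his_0001.nc

will print the time values actually stored in each record of that file.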

nageswararao

#5 Post by nageswararao »

Yes, you are right. It was indeed a quota problem. I have now increased the quota and rerun the model, and it no longer blows up for that reason. But now it blows up in a different way: the output shows NaNQ and the kinetic energy suddenly went negative. Why did this happen? I thought it was due to the CFL criterion, so I changed the time step, but I had already run the model for 5 years and it still blows up after that. What does this mean? Does it still need to spin up? Please tell me whether my idea of changing the time step is right or not.
In my previous post I asked about the model writing the output for different files at the same time step. I mean that if I want to save the output in different files, the model writes the same L value (not the i value) for all files: in his_0001.nc it writes output at L=1, and in his_0002.nc it also writes output at L=1. How can I tell the model to assign the output to the next time record?

Thanks in advance.
