Thanks for your email! The first 4 bytes of the file are "CDF1", so I
don't think I am seeing the CDF2 issue.
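In case it is useful, here is roughly the kind of check I mean (just a sketch; the path is a placeholder, and the 4th byte is the numeric format version you describe below, 1 for classic and 2 for the newer large-file format):

import java.io.DataInputStream;
import java.io.FileInputStream;

public class MagicCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder path; point it at the netCDF file in question.
        DataInputStream in = new DataInputStream(new FileInputStream("example.nc"));
        byte[] magic = new byte[4];
        in.readFully(magic);
        in.close();
        // Classic netCDF files begin with 'C' 'D' 'F' followed by a numeric
        // version byte: 1 for the classic format, 2 for the 64-bit offset format.
        System.out.println("magic = " + (char) magic[0] + (char) magic[1]
                + (char) magic[2] + ", version byte = " + magic[3]);
    }
}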
It doesn't appear that I am seeing truncated files, because I can run
NCDump using the webstart netcdfTools on the Unidata web site and get
successful output.
But this didn't work on June 22. On that day the netcdfTools threw an
exception:
java.io.IOException: File is truncated calculated size= 23262660 actual = 23262596
  at ucar.nc2.N3header.read(N3header.java:236)
  at ucar.nc2.N3iosp.open(N3iosp.java:92)
  at ucar.nc2.NetcdfFile.<init>(NetcdfFile.java:730)
  at ucar.nc2.NetcdfFile.open(NetcdfFile.java:329)
  at ucar.nc2.NetcdfFile.open(NetcdfFile.java:214)
  at ucar.nc2.NetcdfFile.open(NetcdfFile.java:161)
  at ucar.nc2.dataset.NetcdfDataset.openFile(NetcdfDataset.java:171)
  at ucar.nc2.ui.ToolsUI$NCdumpPanel.run(ToolsUI.java:849)
  at ucar.nc2.ui.ToolsUI$GetDataTask.run(ToolsUI.java:1917)
  at java.lang.Thread.run(Thread.java:534)
Now it works fine on the same file (it is a MINC file from the
Montreal group).
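For reference, the check you suggest below (dumping the values of the last variable) can also be done programmatically with the version 2 library. This is only a rough sketch; the placeholder path and the getVariables method name are my assumptions about the 2.x API, while open and read appear in the traces above:

import java.util.List;
import ucar.ma2.Array;
import ucar.nc2.NetcdfFile;
import ucar.nc2.Variable;

public class LastVariableCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder path; the MINC file in question would go here.
        NetcdfFile ncfile = NetcdfFile.open("example.mnc");
        try {
            List vars = ncfile.getVariables();
            // The last variable's data lives toward the end of the file, so
            // reading it successfully is a reasonable truncation check.
            Variable last = (Variable) vars.get(vars.size() - 1);
            Array data = last.read();
            System.out.println("read " + data.getSize() + " values from " + last.getName());
        } finally {
            ncfile.close();
        }
    }
}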
I am willing to go in and patch the version 1 code myself, but it would
be helpful if I knew what I was looking for. Could you give me some
insight into what the problem (and the fix) was?
Many thanks,
Scott
On Fri, 2005-11-18 at 09:19, John Caron wrote:
> Scott Neu wrote:
> > Hello,
> >
> > I know I'm somewhat behind the times, but a while ago I downloaded the
> > NetCDF java library version 1 and integrated that into my code.
> > Everything has been great up until last July, when I started to receive
> > reports that my code was no longer able to parse certain NetCDF files.
> >
> > The obvious solution is to upgrade to version 2, but the class library
> > has changed too much for me to do this quickly (and I just don't have
> > the time right now).
> >
> > My question is: is there a quick patch I can write to parse these newer
> > NetCDF files using the version 1 library?
>
> No, I'm afraid version 1 is no longer maintained.
>
> > Did a substantial change occur in the definition of NetCDF files?
>
> One possibility is that you are seeing truncated files. Double-check with the
> C library ncdump program, and dump the values of the last variable in the
> file. If that fails, you have a truncated file.
>
> Otherwise, maybe you are seeing the "truncated netcdf problem", where the
> writer doesn't write all the bytes to the file. This has always been there, and
> I guess you are just seeing these files now (??).
>
> There has been a change to allow files > 2 GB, but those files are not in
> wide circulation. Dump out the first 4 bytes of your file: the old version has
> CDF1 and the new one has CDF2 (where the 4th byte is numeric, not a character).
>
> >
> > I get end-of-file errors because the offsets to the variable data are larger
> > than the file length itself.
> >
> > thanks for any helpful advice,
> > Scott
> >
> > java.io.EOFException
> >   at javax.imageio.stream.ImageInputStreamImpl.readInt(ImageInputStreamImpl.java:235)
> >   at ucar.netcdf.NetcdfStream$V1IntegerIo.readArray(NetcdfStream.java:1090)
> >   at ucar.netcdf.NetcdfStream$V1Io.toArray(NetcdfStream.java:757)
> >   at ucar.netcdf.NetcdfStream$V1Io.toArray(NetcdfStream.java:721)
> >   at ucar.netcdf.Variable.toArray(Variable.java:296)
> >   at ucar.nc2.Variable.read(Variable.java:229)
> >   at ucar.nc2.NetcdfStream.cacheData(NetcdfStream.java:102)
> >