Hi Ramakrishnan
Do you know what is stored in the "time" variable, if anything?
ncdump -c prod.nc
should tell you (it prints the header plus the values of the coordinate
variables, including time).
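For example, assuming the variable name from the header you posted below,
this should print just the header and the stored time values (if any):

ncdump -v time prod.nc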
A brute-force method to fix a broken netCDF file (if the file size is not
gigantic) that I used in the past is this:
First, use ncdump to dump the whole file to a text file, say:
ncdump prod.nc > prod.cdl
CDL is the "common data language", a text representation of a netCDF file.
Second, edit/doctor the prod.cdl text file (with vi, emacs, etc.).
Replace the time dimension's 0 with the correct value.
If the time coordinate variable is wrong, replace each of its entries with
the correct value.
Be very careful not to mess up the CDL syntax (there are commas separating
values, semicolons terminating declarations, and so on).
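To make that edit step concrete, here is a rough sketch of what the doctored
prod.cdl might look like, based on the header you posted (you mentioned 14
frames in the attached file; the time values 10, 20, 30 are placeholders
only, and everything marked "..." should be kept exactly as ncdump printed
it):

netcdf prod {
dimensions:
        frame = 14 ;    // was: frame = UNLIMITED ; // (0 currently)
        spatial = 3 ;
        atom = 20504 ;
variables:
        char spatial(spatial) ;
        float time(frame) ;
                time:units = "picosecond" ;
        float coordinates(frame, atom, spatial) ;
                coordinates:units = "angstrom" ;
// global attributes:
        // ... keep the global attributes as printed ...
data:

 spatial = ... ;        // keep as printed

 time = 10, 20, 30, ... ;       // one correct value per frame

 coordinates = ... ;    // keep the coordinate data as printed
}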
Save the edited prod.cdl file, possibly with a different name, say
prod_new.cdl
Finally, use ncgen to regenerate a netCDF file from the edited CDL:
ncgen -b -o prod_new.nc prod_new.cdl
This will create the prod_new.nc file, presumably with everything fixed
(assuming the original file contained the correct data, apart from the time
dimension and perhaps the time coordinate variable).
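Once prod_new.nc exists, a quick sanity check could be:

ncdump -h prod_new.nc          (the frame dimension should no longer show 0)
ncdump -v time prod_new.nc     (the time values should look right)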
I am citing from memory here, so please double check the ncgen man page
before you try.
For details on the ncgen syntax, see:
https://linux.die.net/man/1/ncgen
I used this primitive method to fix a few damaged netCDF files in the past,
in desperate situations like yours.
I am not proud of its elegance,
but sometimes elegance is the last thing you care about,
and it worked for me.
Also, you may need to run "ncdump -k prod.nc" beforehand,
to check which netCDF format variant your file uses,
and then request the same format in the ncgen command above.
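For example, if

ncdump -k prod.nc

prints something like "64-bit offset", I believe ncgen can be told to produce
the same variant with its -k option:

ncgen -k '64-bit offset' -b -o prod_new.nc prod_new.cdl

(again from memory; please check the exact option spelling in the man page).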
I hope it helps.
Gus Correa
On Fri, Feb 4, 2022 at 9:52 AM Ramakrishnan N <ram.n.krishnan@xxxxxxxxx>
wrote:
> I have a netcdf file (prod.nc) that contains time series from a molecular
> dynamics simulation (Amber force field, OpenMM engine, parmed
> netCDFReporter). The netCDFReporter had some problems and as a result, the
> number of frames in the netcdf file is zero. Given below is the ncdump for
> the file:
>
> $ ncdump -h prod.nc
> netcdf prod {
> dimensions:
>         frame = UNLIMITED ; // (0 currently)
>         spatial = 3 ;
>         atom = 20504 ;
> variables:
>         char spatial(spatial) ;
>         float time(frame) ;
>                 time:units = "picosecond" ;
>         float coordinates(frame, atom, spatial) ;
>                 coordinates:units = "angstrom" ;
> // global attributes:
>                 :Conventions = "AMBER" ;
>                 :ConventionVersion = "1.0" ;
>                 :application = "AmberTools" ;
>                 :program = "ParmEd" ;
>                 :programVersion = "3.4.0+11.g1be8ca0f" ;
>                 :title = "ParmEd-created trajectory" ;
> }
>
> However, the netcdf file has non-zero size (that increases linearly with
> the number of frames stored) which implies that it certainly has the data
> written into it. I tried a number of tools (nco tools, netCDF4, scipy
> netcdf reader, xarray) to access the missing data but have not succeeded.
>
>
> I have two questions:
> 1. Does the file contain real data?
> 2. If the former, is there a way to retrieve the data and create a new
> netcdf file?
>
> I am desperately looking to salvage nearly 3 microseconds of simulation data
> that would take more than 2 months to regenerate. I would greatly appreciate
> it if anyone can provide me with some insight into this problem.
>
> The attached netcdf file has 14 frames that can be used to examine the
> issue.
>
> Thanks in advance
>
> Best
> Ram
>