Hello, I've created a NetCDF convention for gathering into a single
output file the results contained in many (e.g., 50) NetCDF files
(of the ANDI spectrometry type). Data in the new convention include
two 3-D arrays whose unlimited dimension is the number of input
files. But these conventions/types may be unimportant to my
question.
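For concreteness, here is a minimal sketch of how the output file's
schema might be defined with RNetCDF; the dimension names, variable
names, and sizes below are placeholders, not my actual convention:

    library(RNetCDF)

    nc.out <- create.nc("combined.nc")

    # Unlimited dimension indexes the input files; the others are fixed
    dim.def.nc(nc.out, "file", unlim = TRUE)
    dim.def.nc(nc.out, "mass",  500)
    dim.def.nc(nc.out, "scan", 1000)

    # Two 3-D arrays stored along the unlimited 'file' dimension
    var.def.nc(nc.out, "intensity", "NC_DOUBLE", c("mass", "scan", "file"))
    var.def.nc(nc.out, "times",     "NC_DOUBLE", c("mass", "scan", "file"))

    close.nc(nc.out)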
After some coding false starts, I can now iterate through a list of
input files and load their data into the output file without obvious
memory problems. But everything stops at iteration 38, when the
output file just exceeds 2 gigabytes; specifically, when it reaches
2,097,144 kilobytes.
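The loop looks roughly like the sketch below (again with placeholder
file, dimension, and variable names, and assuming each input file
already holds one 500 x 1000 slab):

    library(RNetCDF)

    in.files <- list.files("input_dir", pattern = "\\.cdf$", full.names = TRUE)
    nc.out   <- open.nc("combined.nc", write = TRUE)

    for (i in seq_along(in.files)) {
        nc.in <- open.nc(in.files[i])
        slab  <- var.get.nc(nc.in, "intensity")   # placeholder source variable
        close.nc(nc.in)

        # Write slab i along the unlimited 'file' dimension;
        # this is the var.put.nc call that fails at iteration 38
        var.put.nc(nc.out, "intensity", slab,
                   start = c(1, 1, i), count = c(500, 1000, 1))
    }
    close.nc(nc.out)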
My question: Have I run up against a NetCDF file size limitation?
Or is this some glitch in R or in Windows?
Thanks,
-John
P.S. I'm working in the R (v. 2.3.1) statistical programming
environment, using Pavel Michna's RNetCDF (v. 1.2-1) package within
R to access netCDF (v. 3) library functions. This is all happening
on a Windows XP laptop with 512 MB of RAM. The command in my
iteration that fails is "var.put.nc". The error is generated from
the line "stop(nc$errmsg, call. = FALSE)". The error message is
"Invalid Argument". -JT
John Thaden PhD
University of Arkansas for Medical Sciences
Little Rock AR, USA