how should I design my data files?

Hello everyone,

I'm planning on using netCDF format data files for our new
acquisition/playback system. The data are a number of
channels of analog data (1-8), which are either recorded or
being used to drive VCO's, and a number of channels of
timestamps of significant events which are recorded. A set 
of such data can occur any number of times within one experiment.

The analog data can be set to a fixed number of data points,
however the event data count is unknown beforehand. The number 
of sets is also unknown beforehand.

My problem is, how can I store a number of sets in one netCDF
file, if I have two unlimited dimensions, namely the event
timestamps, and the number of data sets? It looks like I
have to use a separate file for each data set, is that right?
If an experiment's data is stored in 100's of little files, we're
bound to lose one of them :(. Should I use a preset number
of sets? I would prefer to keep all the data for an experiment
in one big file.

This is partial CDL of my file:

netcdf file {
dimensions:
        sample = 10000;        // for example, we know this before the experiment
        analog_channel = 8;    // ditto
        event_channel = 6;     // ditto
        set = UNLIMITED;
        timestamp = UNLIMITED;
variables:
        short analog(set, analog_channel, sample);
        long  event(set, event_channel, timestamp);
}
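One compromise I've sketched, since netCDF only allows a single unlimited dimension, is to preset an upper bound on the event count per set and record the actual count in a companion variable, so that "set" is the only unlimited (record) dimension. The max_events bound and the variable names here are just placeholders:

netcdf file {
dimensions:
        sample = 10000;
        analog_channel = 8;
        event_channel = 6;
        max_events = 2048;     // preset upper bound on events -- placeholder value
        set = UNLIMITED;       // the single unlimited (record) dimension
variables:
        short analog(set, analog_channel, sample);
        long  event(set, event_channel, max_events);
        long  event_count(set, event_channel);  // actual number of events stored per channel
}

This wastes space whenever the real event counts fall well short of the bound, but it would keep everything for an experiment in one file.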

Thanks for any advice. Is netCDF a good data format for such data?
Any other common format that would be better?

Bill Morrow    Clinical Neurosciences, University of Calgary
e-mail: morrow@xxxxxxxxxxxxxxx voice: (403) 220-6275 fax: (403) 283-8770 
http://www.cns.ucalgary.ca/~morrow
HMRB 105, 3330 Hospital Drive NW Calgary, Alberta, CANADA T2N 4N1

