Re: slow write on large files




Thanks to a nudge from Russ Rew, I've solved the problem.

I can now write files right up to the 2GB limit without problems.

Using nc_set_fill() with the parameter NC_NOFILL makes the problem go away.
I can't claim to understand why, but it works.  It also has the added
benefit that the file is created very quickly, since the library doesn't
prefill the variable section of the file at initial creation.
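
For anyone who hits this in the archives later, the call itself is simple.
Here's a minimal sketch (the filename and the surrounding setup are
placeholders, not my actual code):

    #include <netcdf.h>
    #include <stdio.h>
    #include <stdlib.h>

    static void check(int status)
    {
        if (status != NC_NOERR) {
            fprintf(stderr, "netCDF error: %s\n", nc_strerror(status));
            exit(1);
        }
    }

    int main(void)
    {
        int ncid, old_fill_mode;

        check(nc_create("video.nc", NC_CLOBBER, &ncid));

        /* Turn off prefilling.  With the default (NC_FILL), the
           library writes fill values into a variable's space before
           the real data overwrites them, so bytes can end up being
           written to disk twice. */
        check(nc_set_fill(ncid, NC_NOFILL, &old_fill_mode));

        /* ... define dimensions/variables, nc_enddef(), write data ... */

        check(nc_close(ncid));
        return 0;
    }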

I would have thought that a file prefilled with dummy data could be written
faster than one that has not been prefilled, but apparently I'm wrong.
I don't know whether this situation is specific to Windows, or whether it
will show up on other platforms as well.

Jim




From: James Garnett <James_Garnett@raytheon.com>
Sent by: owner-netcdfgroup@unidata.ucar.edu
Date: 06/09/2004 01:31 PM
Please respond to: James Garnett
To: netcdfgroup@xxxxxxxxxxxxxxxx
Subject: slow write on large files

I'm using NetCDF to stream video to disk in real time.  I'm doing this on a
PC running Windows, in C, using the Windows implementation of NetCDF and
Visual C++ 6.0.

The data is coming in at about 7.5 MB/second.
I'm using successive calls to nc_put_vara_short() to write each frame of
video data.
When my file size is small (1000 frames of data or less), I have no
problem.  When my file size is much larger (1500 frames or more), my
application runs horribly: the calls to nc_put_vara_short() appear not to
be keeping up with the incoming data.  Yes, I'm buffering in RAM, but my
buffer is limited, and eventually it overflows.  The real bugger is that
the problems start occurring very early in a large file; it's not as if
writing is fine for the first 1000 frames and only then starts lagging.
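
In sketch form, each frame write looks like this (simplified; the variable
layout and dimension sizes are placeholders, not my real ones):

    #include <netcdf.h>
    #include <stddef.h>

    #define HEIGHT 480   /* placeholder frame dimensions */
    #define WIDTH  640

    /* Variable "video" is NC_SHORT with dimensions
       (frame = NC_UNLIMITED, row = HEIGHT, col = WIDTH). */
    int write_frame(int ncid, int varid, size_t frame, const short *pixels)
    {
        size_t start[3] = { frame, 0, 0 };       /* record index of frame */
        size_t count[3] = { 1, HEIGHT, WIDTH };  /* one whole frame       */

        return nc_put_vara_short(ncid, varid, start, count, pixels);
    }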

Is there something about the way nc_put_vara_short() is coded that makes it
slow down based on the TOTAL SIZE of the file (or just the variable portion
of the file)?

When I write the data out to disk using just CFile::Write()* instead of the
NetCDF library, I have no problems.
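
For comparison, the raw-write baseline is essentially just this
(CFile::Write is MFC/C++; the plain-C equivalent is the same experiment):

    #include <stdio.h>

    /* Raw sequential write of one frame, no netCDF involved. */
    int write_frame_raw(FILE *fp, const short *pixels, size_t nshorts)
    {
        return fwrite(pixels, sizeof(short), nshorts, fp) == nshorts ? 0 : -1;
    }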


Thanks,

Jim



* CFile is a class provided by MFC (Microsoft Foundation Classes) to make
disk I/O simple.




