John Caron wrote:
> Greg Sjaardema wrote:
>> John Caron wrote:
>>>
>>> Greg Sjaardema wrote:
>>>> As a quick answer to the question, we (Sandia Labs) use netCDF
>>>> underneath our exodusII file format for storing finite element
>>>> results data.
>>>>
>>>> If the mesh contains #nodes nodes and #elements elements, then there
>>>> will be a dataset of size #elements * 8 * 4 bytes (assuming a hex
>>>> element with 8 nodes and 4 bytes per int) to store the nodal
>>>> connectivity of each hex element in a group of elements (an element
>>>> block). Assuming the 4 GiB per-variable limit of CDF-2, this
>>>> restricts us to ~134 million elements per element block, which is
>>>> large, but not enough to give us more than a few months of breathing
>>>> room. Using the CDF-1 format, we top out at about 30 million
>>>> elements or fewer, a limit we hit routinely.
>>> I'm not sure I understand the problem yet:
>>>
>>> In the file you sent me, you use time as the record variable.
>> Yes.
>>> Each record variable must be less than 2^32 bytes, not counting the
>>> record dimension. So you can have about 2^29 elements, assuming each
>>> element is 8 bytes. And you can have 2^32 time steps.
>> Yes, but the record variables are not currently the limiting factor.
>>> The non-record variables are dimensioned (num_elements, num_nodes).
>>> The number of nodes seems to be 8, and these are ints, so you have
>>> 32 bytes * num_elements, giving a max of 2^27 elements = 134 million
>>> elements.
>> Yes, this is where we are hitting the limit.
>>> Currently the largest you have is 33000. Do you need more than 2^27?
>> The file I sent was representative. As I think I mentioned in that
>> email, we have models ranging in size from 1 element up to 250 million
>> elements and everything in between. The header I sent was just to show
>> how the models are set up, not an example of the largest models we
>> use. Yes, we definitely need more than 2^27 elements.
>>> I'm not sure what you mean by "a few months breathing room".
>> In a few more months, if we do not change either the underlying netCDF
>> limits or the way our exodusII format uses netCDF, we will have to
>> tell users that the models they require cannot be stored.
>
> OK, thanks for clarifying that.
>
> Do you expect to need more than 2^31 = 2.147 * 10^9 elements? That is
> the usual limit for array indices, and it would be difficult (at least
> in Java) to read more than that at once. Reading sections at a time
> would be OK.
> Is there any "natural" way to break up the element array into more
> than one dimension?
2^31 elements should be OK for a while. The element array can be broken
up into (num_elements, num_nodes) very easily. The "num_nodes"
dimension would typically range from 1 for SPH elements up to 27 for
higher-order hex elements. The "num_elements" dimension would range
from 1 up to 2^31.
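
For concreteness, here is a minimal sketch of how such a 2-D
connectivity variable could be declared through the netCDF C API under
CDF-2. The file, dimension, and variable names are illustrative, not
our actual exodusII conventions:

/* Minimal sketch, not actual exodusII code: declare a 2-D nodal
 * connectivity variable in a CDF-2 (64-bit offset) netCDF file.
 * Names and sizes are illustrative. */
#include <stdio.h>
#include <stdlib.h>
#include <netcdf.h>

#define CHECK(stat) do { int s = (stat); if (s != NC_NOERR) { \
    fprintf(stderr, "netCDF error: %s\n", nc_strerror(s)); exit(1); } } while (0)

int main(void)
{
    int ncid, elem_dim, node_dim, connect_var;
    int dimids[2];

    /* NC_64BIT_OFFSET selects CDF-2, which lets each fixed-size
     * variable approach 4 GiB (vs. ~2 GiB total under CDF-1). */
    CHECK(nc_create("connect_sketch.nc", NC_CLOBBER | NC_64BIT_OFFSET, &ncid));

    /* 134,000,000 hex elements * 8 nodes * 4 bytes ~= 4.29e9 bytes,
     * just under the CDF-2 per-variable ceiling of 2^32 - 4 bytes. */
    CHECK(nc_def_dim(ncid, "num_elements", 134000000, &elem_dim));
    CHECK(nc_def_dim(ncid, "num_nodes", 8, &node_dim));

    dimids[0] = elem_dim;   /* slowest-varying dimension */
    dimids[1] = node_dim;   /* 8 node ids per hex element */
    CHECK(nc_def_var(ncid, "connect", NC_INT, 2, dimids, &connect_var));

    CHECK(nc_enddef(ncid));
    CHECK(nc_close(ncid));
    return 0;
}

The per-variable ceiling in that sketch is where the ~134 million hex
element limit discussed above comes from.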
--Greg