Hi Carlos, responses are below.
Carlos Valiente wrote:
Hi! We are having memory exhaustion issues in our THREDDS setup (using
THREDDS 3.16.35 running under Tomcat 6.0.10 with Java 1.6.0_04-b12 on
SLES 9 Linux x86_64).
We have aggregated the following NetCDF files:
http://ensembles.ecmwf.int/thredds/catalog/demeter-non-agg/catalog.html
by variable:
http://ensembles.ecmwf.int/thredds/demeter/variables.html
using the following XML catalogue snippets:
[catalog.xml]
..
<datasetScan name="DEMETER Data - Non-aggregated"
ID="demeterNonAggregated"
path="demeter-non-agg"
location="/vol/data/demeter/"
serviceName="thisDODS">
<filter>
<include wildcard="*.nc"/>
</filter>
</datasetScan>
<catalogRef name="DEMETER data - Aggregated variables"
xlink:href="variables.xml"
xlink:title="DEMETER data - Aggregated variables" />
..
[variables.xml]
..
<dataset name="Geopotential"
ID="demeter-geopotential"
urlPath="demeter/g">
<serviceName>thisDODS</serviceName>
<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2">
<aggregation dimName="time" type="joinExisting">
<scan location="/vol/data/demeter/129/"
suffix=".nc" />
</aggregation>
</netcdf>
</dataset>
..
<dataset name="Total precipitation accumulated in the previous
24 hours"
ID="demeter-rain"
urlPath="demeter/rain">
<serviceName>thisDODS</serviceName>
<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2">
<aggregation dimName="time" type="joinExisting">
<scan location="/vol/data/demeter/228/"
suffix=".nc" />
</aggregation>
</netcdf>
</dataset>
..
We are able to retrieve data from the individual NetCDF files, and also
from some aggregated variables, like 'Total precipitation accumulated in
the previous 24 hours':
http://ensembles.ecmwf.int/thredds/dodsC/demeter/rain.html
However, with some other variables like 'Geopotential':
http://ensembles.ecmwf.int/thredds/dodsC/demeter/g.html
the download fails, and in catalina.out we get the following stack trace:
DODServlet ERROR (anyExceptionHandler): java.lang.OutOfMemoryError: Java heap space
ReqState:
serverClassName: 'thredds.server.opendap.NcDODSServlet'
dataSet: 'demeter/g'
requestSuffix: 'dods'
CE: ''
compressOK: false
InitParameters:
DebugOn: ''
maxNetcdfFilesCached: '10'
java.lang.OutOfMemoryError: Java heap space
at ucar.ma2.ArrayFloat.<init>(ArrayFloat.java:86)
at ucar.ma2.ArrayFloat$D5.<init>(ArrayFloat.java:335)
at ucar.ma2.ArrayFloat$D5.<init>(ArrayFloat.java:327)
at ucar.ma2.ArrayFloat.factory(ArrayFloat.java:52)
at ucar.ma2.ArrayFloat.factory(ArrayFloat.java:36)
at ucar.ma2.Array.factory(Array.java:130)
at ucar.ma2.Array.factory(Array.java:79)
at ucar.nc2.ncml.Aggregation.read(Aggregation.java:604)
at ucar.nc2.ncml.Aggregation.read(Aggregation.java:659)
at ucar.nc2.dataset.VariableDS._read(VariableDS.java:277)
at ucar.nc2.Variable.read(Variable.java:618)
at thredds.server.opendap.NcSDArray.read(NcSDArray.java:102)
at opendap.dap.Server.SDArray.serialize(SDArray.java:426)
at opendap.dap.Server.CEEvaluator.send(CEEvaluator.java:275)
at opendap.servlet.AbstractServlet.doGetDAP2Data(AbstractServlet.java:805)
at opendap.servlet.AbstractServlet.doGet(AbstractServlet.java:1626)
at thredds.server.opendap.NcDODSServlet.doGet(NcDODSServlet.java:269)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:690)
[..]
We're starting up Tomcat with -Xmx406m, in order to give it 4 GB of
memory, but that does not seem to be enough. We've tried raising that
limit to 6 GB, which is the total memory (physical + virtual) available
on that box, but it did not make any difference.
-Xmx406m = 406 MB; I assume you mean -Xmx4096m?
Are you running a 64-bit Linux and JVM? Otherwise the JVM maxes out at around 2 GB.
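
As a quick sanity check, you could run a small class (or the equivalent in a JSP deployed to the same Tomcat) to see what heap the JVM actually got and whether it is 64-bit. A minimal sketch; note that 'sun.arch.data.model' is a Sun-JVM-specific property, so treat it as a hint only:

public class HeapCheck {
    public static void main(String[] args) {
        // Maximum heap the JVM will try to use; reflects the effective -Xmx.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024L * 1024L)) + " MB");

        // Reports "64" on a 64-bit Sun JVM; vendor-specific property.
        System.out.println("Data model: " + System.getProperty("sun.arch.data.model", "unknown"));
        System.out.println("os.arch:    " + System.getProperty("os.arch"));
    }
}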
Are THREDDS memory requirements for aggregation higher than those
figures, or are we perhaps doing something not too clever with our setup?
The aggregation itself should not be the problem, but how big is the request?
The current OPeNDAP implementation requires the whole response to be built in memory.
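
For a rough sense of how big that unconstrained Geopotential request is, a sketch along these lines (using the NetCDF-Java API the TDS is built on) opens the aggregation and compares a full read with a one-time-step subset. The URL is the one from your catalog, but the variable name "g" and the 5-dimensional section spec are assumptions (the rank is guessed from the ArrayFloat$D5 in the stack trace), so adjust them to the actual variable:

import ucar.ma2.Array;
import ucar.nc2.Variable;
import ucar.nc2.dataset.NetcdfDataset;

public class ResponseSizeEstimate {
    public static void main(String[] args) throws Exception {
        // Open the aggregated dataset through its OPeNDAP URL (or point at a local NcML file).
        NetcdfDataset ds = NetcdfDataset.openDataset(
                "http://ensembles.ecmwf.int/thredds/dodsC/demeter/g");
        Variable v = ds.findVariable("g"); // hypothetical variable name -- adjust

        // An unconstrained request has to materialize the whole variable in memory.
        long bytes = v.getSize() * v.getDataType().getSize();
        System.out.println("Full response: ~" + (bytes / (1024L * 1024L)) + " MB");

        // A constrained request (here: only the first time step) stays small.
        Array firstStep = v.read("0,:,:,:,:");
        System.out.println("Subset elements: " + firstStep.getSize());

        ds.close();
    }
}

If the full-variable figure comes out larger than the heap you can give Tomcat, the practical options are to constrain the request on the client side or to raise -Xmx on a 64-bit JVM.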