Hi,
I'm encountering "java.lang.OutOfMemoryError: Java heap space" errors when 
aggregating netCDF files. The messages appear in 
tomcat5/content/thredds/logs/threddsServlet.log, and instead of the normal 
OPeNDAP access page ( 
http://www.ngdc.noaa.gov/thredds/dodsC/sst-aerosol-nl-aggregation.html 
) the server returns a page showing
Error { code = -1; message = "Server Error on dataset sst-aerosol-nl-aggregation"; }; 
The page usually loads normally for a few requests, but once the 
OutOfMemoryError occurs the service stays down until I restart Tomcat. I 
tried upgrading to the latest THREDDS war file, 3.16.48.0, but still see 
this error. The TDS is running on Red Hat Enterprise Linux WS release 3 
(Taroon Update 9), with java.vm.version = 1.5.0_06-b05 and Tomcat 5.5.17.
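For reference, here is the kind of heap setting I understand Tomcat picks up at startup. This is only a sketch: the bin/setenv.sh location and the -Xmx/-Xms values shown are assumptions for illustration, not our actual configuration, and I don't know whether a larger heap would fix the problem or merely delay it:

```shell
# bin/setenv.sh -- sourced by catalina.sh at startup (path is an assumption)
# Raise the maximum heap; the small default heap on Java 5 can be
# exhausted by large aggregations. Values below are examples only.
CATALINA_OPTS="-Xms128m -Xmx512m"
export CATALINA_OPTS
```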
The files we are serving are SST Aerosol products with a Time coordinate 
and a single time slice per file. The typical file size is 3.6 MB. Each 
file carries quite a bit of metadata as global attributes, but I think we 
can factor that out with an NcML wrapper file. Here is the wrapper file, 
which is in the same directory as the data files:
<?xml version="1.0" encoding="UTF-8"?>
<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2" >
    <attribute name="myAttribute" value="NN-attribute" />
    <aggregation dimName="Time" type="joinExisting">
      <scan location="/data1/thredds/sst/Aerosol_NN_test/" 
            suffix=".nc" />
    </aggregation>
</netcdf>
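One thing I wondered about: would listing the files explicitly with coordValue attributes, instead of using a scan, keep the server from having to open every file just to read its Time coordinate? A sketch of what I mean (the file names and coordinate values here are made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2">
  <aggregation dimName="Time" type="joinExisting">
    <!-- hypothetical file names and Time values, for illustration only -->
    <netcdf location="/data1/thredds/sst/Aerosol_NN_test/file1.nc" coordValue="100" />
    <netcdf location="/data1/thredds/sst/Aerosol_NN_test/file2.nc" coordValue="101" />
  </aggregation>
</netcdf>
```

I don't know whether that would actually reduce memory use, or only startup time, so any guidance is appreciated.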
And here is the portion of the catalog.xml that uses this aggregation:
<!-- Aerosol NN, joinExisting with ncml wrapper
-->
<dataset name="SST Aerosol Aggregation NOAA-18(N)" ID="SST-Aerosol-NN-Agg"
         urlPath="sst-aerosol-nn-aggregation">
  <metadata inherited="true">
    <serviceName>multiple</serviceName>
    <dataType>Grid</dataType>
  </metadata>
  
  <netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2"
          location="/data1/thredds/sst/Aerosol_NN_test/Aerosol_NN_test.ncml" />
</dataset>
Do you see any problems with my configuration? Earlier posts to this 
list indicated that aggregating a large number of files should not be a 
problem. This particular aggregation covers 245 files. I reliably get 
errors right away if I also set up a separate but similar aggregation of 
135 files in the same catalog.
Thanks for any suggestions,
Ken
--
= Enterprise Data Services Division ===============
| CIRES, National Geophysical Data Center / NOAA  |
| 303-497-6221                                    |
= Ken.Tanaka@xxxxxxxx =============================