Check out the Unix command "ulimit"; you can use it to raise the
maximum number of open files (e.g. "ulimit -n 1024", up to the hard
limit shown by "ulimit -Hn").
On 7 September 2010 19:10, Gregory Sjaardema <gdsjaar@xxxxxxxxxx> wrote:
> There is a sysconf call that can give you the maximum number of open files
> supported on the current system.
> The call is "sysconf(_SC_OPEN_MAX)", which returns the maximum number of
> simultaneously open files (including stdin, stdout, and stderr), or -1 if
> there is no determinate limit.
>
> My guess is that OS X is limiting you to 256.
>
> --Greg
>
> On 9/7/10 9:53 AM, Ted Mansell wrote:
>>
>> Howdy,
>>
>> I'm using netcdf4.1.2-beta1, and I get the error "Too many open files"
>> upon trying to open the 254th file (out of 616). This is on OS X
>> (10.5), and the first time I've pushed it that far on this platform.
>> On an ia64 machine, however, I've been able to open all 616 files (the
>> same files I'm trying to open under OS X; not sure which version of
>> netcdf4 I'm using on the ia64 system). Is there a setting somewhere
>> for the max number of open files? Is this an error that comes from
>> HDF5?
>>
>> I've looked through the online docs and even searched through the
>> netcdf code, and I don't see where the error value is even used (it
>> is only checked for in error.c). The application is compiled with
>> 64-bit memory access, and it does not appear to be running out of
>> memory (it is using less than 1.5 GB when it quits).
>>
>> Thanks for any help...
>>
>> By the way, the netcdf.inc file defines ncenfile = -31, but it is
>> NC_ENFILE = -34 in the C code.
>>
>> -- Ted
>>
>> _______________________________________________
>> netcdf-hdf mailing list
>> netcdf-hdf@xxxxxxxxxxxxxxxx
>> For list information or to unsubscribe, visit:
>> http://www.unidata.ucar.edu/mailing_lists/
>>
>