I’ve got a problem with a ROOT script that loops over a large number of ROOT files and creates histograms from trees stored in these files.
Everything works fine as long as the number of files stays at or below 100.
I traced the crash down to this piece of code (combined is a tree from one of the many files; the current file is the one that should gather all the plots):
cout << ++counter << endl;
TH1F* h_ncluster = new TH1F("nCluster_"+triple,"nCluster"+triple,6,-.5,5.5);
cout<< "DEBUG1" << endl;
cout<< "DEBUG2" << endl;
The output of ROOT is:
The type of crash looks very strange to me, too. Usually I only get lots of segmentation faults from ROOT.
I think you meant to type:
combined->Draw(Form("ncluster>>nCluster_%d",triple));
i.e. you cannot ‘simply’ add to a const char* string.
Sorry, I forgot to mention that “triple” is a TString, which can be added to a const char*.
The strange thing is that my code passes all test cases where I process fewer than 101 files. The Draw command for the 101st file seems to throw an exception (see log above; DEBUG1 is written to the console, DEBUG2 is missing), which I have never observed before.
What happens if you process the 101st file by itself (i.e. is the crash because you are processing 101 files, or is it because of the content of the 101st file)?
Also, could it be that you are running out of memory?
I can process the same file 100 times without any problems; if I process this file 101 times in my loop, I get the crash mentioned above. Running out of memory is quite unlikely, unless there is some limitation due to the memory management of ROOT. When I run my script, I can observe an increasing memory consumption (I create lots of histograms), but at the time of the crash top reports a memory usage of about 15%, and the computer does not start to swap.
Then your best bet is to run under gdb and/or valgrind to see where/why the crash happens.
Finally I managed to get my script running over more than 100 files. In the initial version of my script, I opened the files while looping over all files in one directory and closed all of them after the loop. In the version that does not crash, I close each file within the loop.
Is there a limit of 100 files open at the same time in ROOT? The “limit” command on my SL4 box shows me a limit of 1024 descriptors, so the operating system should not be imposing a limit of 100 files.
What I also don’t understand is why the number of open files affects ROOT’s ability to successfully draw a histogram. And shouldn’t there be a warning if a command fails due to the number of currently open files?
I don’t quite understand either. Could you provide a complete example to reproduce the problem?
PS. But then again, it is the right thing to do to close the files once you are done with them.