I am writing a program which reads a ROOT file, which contains several directories, with O(10000) histograms each. These histograms are then fitted, and saved to another ROOT file, including the fits.
The problem is that the memory usage of the job seems to grow steadily. At first I had a single TFile::Write() call at the very end, thinking that ROOT would automatically optimize the memory management. But even adding a TFile::Write() call after the histograms in each directory are fitted didn’t change the behavior … what am I doing wrong?
TFile::Write only writes to the physical file the objects belonging to the TFile (i.e. directories, trees, histograms and other objects attached using Append). In particular, it does not delete the objects. The deletion is done by the method Close.
However, it is safe for you to delete the histograms right after the call to Write.
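A minimal sketch of that write-then-delete pattern (requires ROOT; the file and histogram names here are illustrative, not from the original post):

```cpp
#include "TFile.h"
#include "TH1F.h"

int main() {
  TFile *outfile = TFile::Open("fits.root", "RECREATE");
  TH1F *h = new TH1F("h", "example", 100, -5., 5.);
  h->FillRandom("gaus", 10000);
  h->Write();   // writes the object to the current directory in the file...
  delete h;     // ...after which the in-memory copy can safely be freed
  outfile->Close();
  delete outfile;
  return 0;
}
```

Deleting each histogram immediately after writing it is what keeps the memory footprint flat when looping over O(10000) histograms.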
thanks! I was trying to go down that road, but it seems I haven’t quite gotten it right yet …
I loop over the directories in my input file (variable idir); I then create a directory of the same name in the output file (variable odir). I replaced the ofile->Write() calls after each output directory is complete with odir->Write(); I then added
all in an attempt to delete all histograms currently held in memory. But the memory keeps rising, to about 600MB after the first directory, and about 1.2GB after the second.
This should have gotten rid of the histogram objects, so I am guessing that the leak comes from somewhere else. The next step is to establish which objects are actually being accumulated.
To be able to track the allocation of TObjects, you should modify your .rootrc file - used by ROOT to setup the ROOT environment. Specifically, you want to make sure that two items are set to 1 (one):
Root.MemStat: 1
Root.ObjectStat: 1
Activate memory statistics (size and cnt is used to trap allocation of
thanks, that sounds like a good thing to do! However, I don’t use the interpreter but a compiled program. Can I activate these memory statistics from within that as well?
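For a compiled program, one option (a sketch, not from the original thread; requires ROOT) is to switch on object statistics at run time via TObject::SetObjectStat and then dump the per-class counts with gObjectTable:

```cpp
#include "TObject.h"
#include "TObjectTable.h"

int main() {
  TObject::SetObjectStat(kTRUE);  // start counting TObject allocations
  // ... run the fitting job here ...
  gObjectTable->Print();          // dump the counts per class (TH1F, TF1, ...)
  return 0;
}
```

Printing the object table once per directory makes it easy to see which class count keeps growing between iterations.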
So it looks as if the fit functions are still around, even though they were stored with the histograms, which live in the directories that are destroyed? How can I get rid of them?
I recreate the fitfnc in a loop for an iterative fitting procedure; the relevant lines, I think, are:
TF1* fitfnc(0);
for (int iiter = 0; iiter < niter; iiter++) {
  fitfnc = new TF1("fit", "gaus", fitrange_min, fitrange_max);
  hrsp->GetListOfFunctions()->Delete(); // seems to leave the old fnc in mem?
  hrsp->Fit(fitfnc, "RQ0");
}
I observe that if I reduce niter from 3 to 1, the object table shows only a third as many TF1 objects, so the Delete() call doesn’t seem to remove the old fit functions from memory.
But even with just one iteration, the functions remain in memory, despite my
thanks, the following takes care of all functions but the one actually stored in the histogram:
TF1* fitfnc(0);
for (int iiter = 0; iiter < niter; iiter++) {
  fitfnc = new TF1("fit", "gaus", fitrange_min, fitrange_max);
  hrsp->Fit(fitfnc, "RQ0");
  delete fitfnc;
}
The number of TF1 objects, however, was still non-zero until I added
hrsp->Write();
TF1* f = hrsp->GetFunction("fit");
if (0 != f) delete f;
Is that to be expected? Well, it appears to solve my problem at least: all the fits now make it safely into my file, and the memory stays constant when I apply this to many directories in the same job :) Thank you all for your help!
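Putting the pieces from this thread together, the per-histogram fit-and-cleanup pattern looks roughly like the following (a sketch, requires ROOT; hrsp, niter and fitrange_min/max are taken from the posts above, the surrounding loop is assumed):

```cpp
// For each histogram hrsp in the current directory:
TF1 *fitfnc = 0;
for (int iiter = 0; iiter < niter; iiter++) {
  fitfnc = new TF1("fit", "gaus", fitrange_min, fitrange_max);
  hrsp->Fit(fitfnc, "RQ0");
  delete fitfnc;                     // Fit() attached its own copy, so this is safe
}
hrsp->Write();                       // persist the histogram together with its fit
TF1 *f = hrsp->GetFunction("fit");   // the copy that Fit() attached to the histogram
if (0 != f) delete f;                // free it so the TF1 count stays flat
delete hrsp;                         // free the histogram itself after writing
```

The key point is that Fit() stores a copy of the function in the histogram's list of functions, so both the loop-local TF1 and the attached copy have to be deleted explicitly to keep memory constant across directories.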