Clearing memory after fitting histograms


I’m currently working on a program which is supposed to evaluate about 40000 histograms from one ROOT file, first making some minor checks and then, once certain requirements are met, performing linear and Gaussian fits over selected bins (this happens for ~90% of the histograms).

The code looks like:

TH1F *histo;
// several other variables

for (int i = 1; i < (#Histograms); i++) {
    // set my predefined variables to dummy values
    // make some minor checks

    if (all previous checks ok) {
        TF1 *gfunc = new TF1("gfunc", "gaus", (range));
        // take fit parameters and write them to predefined variables
        delete gfunc;

        TF1 *lfunc = new TF1("lfunc", "pol1", (range));
        // take fit parameters and write them to predefined variables
        delete lfunc;
    }
    // write my variables to a tree
}


The program is already running and the output looks OK, but the problem is that it occupies far too much memory. In my example the input file is about 250 MB and the output only some KB, yet the memory used by the program grows continually, up to a top value of ~630 MB. By taking out the part with the fits, I already found out that this is the part where my problem should be. Some colleagues suggested that my fit TF1 objects are not properly cleared from memory.




I have a question: why do you use new TF1 instead of stack objects? Yes, it won't solve memory-leak problems inside TF1 (if they exist), but at least you do not need to call delete: local objects are destroyed at the end of their block, which is safer (in case of an exception) and possibly faster.
Maybe I don't understand something and you can only use dynamically allocated objects?