Significantly increased memory consumption of StandardHypoTestInvDemo.C in 6.20/06 compared to 6.14/04


I noticed that running StandardHypoTestInvDemo.C in ROOT 6.20/06 leads to much higher memory consumption than in 6.14/04:

Maximum resident set size [Mbytes]:

| N toys | ROOT 6.14/04 | ROOT 6.20/06 |
|-------:|-------------:|-------------:|
|     10 |          165 |          270 |
|    100 |          310 |          865 |
|    200 |          483 |         1151 |
|    500 |          991 |         3537 |
|   1000 |         1835 |         6866 |

The input (apart from the number of toys) is the same. The maximum resident set size was measured on lxplus with

/usr/bin/time --verbose root.exe -l -b -q StandardHypoTestInvDemo.C
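For reference, the peak-RSS number can be pulled out of GNU time's `--verbose` report automatically. This is a minimal sketch that parses a hard-coded sample report (so it runs stand-alone); in the real measurement the report would come from redirecting stderr of the `/usr/bin/time --verbose root.exe ...` invocation above to a log file.

```shell
#!/bin/sh
# Sketch: extract the peak RSS from GNU time's --verbose report.
# Real usage would capture the report first, e.g.:
#   /usr/bin/time --verbose root.exe -l -b -q StandardHypoTestInvDemo.C 2> time.log
# Here a hard-coded sample report stands in for time.log.
log='	Command being timed: "root.exe -l -b -q StandardHypoTestInvDemo.C"
	Maximum resident set size (kbytes): 1879040'

# GNU time prints "Maximum resident set size (kbytes): N"; grab N
rss_kb=$(printf '%s\n' "$log" | awk -F': ' '/Maximum resident set size/ {print $2}')
echo "peak RSS: $((rss_kb / 1024)) MB"
```

Note that `/usr/bin/time` (the external GNU binary) is required; the shell's built-in `time` keyword has no `--verbose` option.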

How safe is it to keep using 6.14/04? I have problems running the macro in 6.20/06 on some platforms due to the high memory consumption. Also, please note that in both 6.14/04 and 6.20/06 I use the ROOT 5 version of StandardHypoTestInvDemo.C, not the ROOT 6 one (for historical reasons) - do you think that will matter in the end?


Hi,
Thank you for your report. I will check later today what could be causing this extra memory usage.
In principle, if the API has not changed, you should be able to use the ROOT 5 version of the macro with the ROOT 6 RooStats code.
Could you also please share the workspace you used to obtain these memory values, since this might also depend on the model? And which parameters did you pass to the macro (e.g. type of calculator, test statistic, etc.)?


Hi Lorenzo,

Thanks for looking into it! I've uploaded a tarball with all the needed files: files.tgz (18.6 KB)
After you untar it, you do

hist2workspace ABCD_full13TeV_sample5.xml

Then change the number of toys from 1000 to XXX (anything you want; see the table in my first post):

sed -i 's/int ntoys = 1000/int ntoys = XXX/' StandardHypoTestInvDemo.C

and run:

root -l -b -q "StandardHypoTestInvDemo.C(5)"

All the parameters I use are in this file: frequentist calculator, one-sided profile-likelihood test statistic, etc.
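The patching step above can be wrapped in a small loop to scan several toy counts in one go. This is a sketch under assumptions: the macro hard-codes the literal line `int ntoys = 1000`, and here a one-line stand-in file is created in place of the real macro from the tarball so the `sed` step runs end to end on its own. It patches a per-run copy so the original file keeps its default.

```shell
#!/bin/sh
# Sketch of the toy-count scan. A one-line stand-in replaces the real
# StandardHypoTestInvDemo.C from the tarball so this runs stand-alone.
printf 'int ntoys = 1000;\n' > StandardHypoTestInvDemo.C   # stand-in only

for n in 10 100 200 500 1000; do
  # patch a per-run copy so the original keeps its default of 1000 toys
  sed "s/int ntoys = 1000/int ntoys = ${n}/" StandardHypoTestInvDemo.C \
      > "demo_${n}toys.C"
  # the real run would be:
  #   /usr/bin/time --verbose root -l -b -q "demo_${n}toys.C(5)"
done
grep -h ntoys demo_*toys.C   # show the patched toy counts
```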

Thanks for sharing the files. I see a few errors in your model, and the toy fits seem to have problems. However, I can reproduce the memory increase between 6.14 and the current master also with the tutorial HistFactory model, example_combined_GaussExample_model.root, generated from the example.xml in tutorials/histfactory.
In that case the memory usage in 6.14 is stable with the number of toys, while in the newer versions it keeps increasing, indicating a possible memory leak. I will investigate…
Thanks for reporting this.




I have opened a GitHub issue to track this memory leak, which was introduced in 6.16.
See Memory leak when running FrequentistCalculator scan in RooStats · Issue #7890 · root-project/root · GitHub


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.