I wish to place upper limits on an exotic physics process. My input data is a 1-D histogram of the mass of my candidates. I parametrize this histogram as the sum of a polynomial PDF representing the background plus a Gaussian PDF (at a given mass) representing my signal.
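For concreteness, here is a minimal PyROOT sketch of the kind of model I have in mind (the observable range, polynomial order and yield ranges are illustrative rather than my exact setup; in my real script the signal yield is expressed in terms of the cross-section parameter introduced below):

```python
import ROOT as r

mass = r.RooRealVar("mass", "candidate mass", 100., 150.)   # observable (range illustrative)

# Background: low-order polynomial shape
a1  = r.RooRealVar("a1", "a1", 0., -1., 1.)
a2  = r.RooRealVar("a2", "a2", 0., -1., 1.)
bkg = r.RooChebychev("bkg", "background", mass, r.RooArgList(a1, a2))

# Signal: Gaussian at a given (fixed) mass with fixed resolution
mean  = r.RooRealVar("mean", "signal mass", 125.)
sigma = r.RooRealVar("sigma", "resolution", 2.)
sig   = r.RooGaussian("sig", "signal", mass, mean, sigma)

# Extended sum of signal and background, with both yields floating
nsig  = r.RooRealVar("nsig", "signal yield", 1., 0., 100.)
nbkg  = r.RooRealVar("nbkg", "background yield", 1000., 0., 1.e6)
model = r.RooAddPdf("model", "sig+bkg", r.RooArgList(sig, bkg), r.RooArgList(nsig, nbkg))
```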
I calculate the limits using the RooStats::FeldmanCousins calculator. However, after reading the documentation (root.cern.ch/root/html/RooStats_ … usins.html), I still have several questions:
- The documentation does not explain what the option FeldmanCousins::FluctuateNumDataEntries(bool) does. Am I correct that I should set this to true if the number of candidates would fluctuate were I to repeat the experiment, and to false if it would always be the same (which is not the case in my example)?
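For context, this is roughly where that option enters in my script (a sketch; `data` is my RooDataSet and `modelConfig` is the ModelConfig set up as shown at the end of this post):

```python
fc = r.RooStats.FeldmanCousins(data, modelConfig)   # data / modelConfig built elsewhere in my script
fc.SetConfidenceLevel(0.95)
fc.FluctuateNumDataEntries(True)   # the option in question: true because my candidate count
                                   # would fluctuate if the experiment were repeated?
interval = fc.GetInterval()
```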
- I declare my “parameter of interest” (which is the process’s cross-section) as follows:
xsec = r.RooRealVar("xsec", "xsec", 1., 0., 100.)
I find that the upper limit calculated by FeldmanCousins depends strongly upon the allowed range I specify for this parameter (0. to 100. here). If I specify a very large range (0. to 100000.), then the precision of the upper-limit calculation is poor. On the other hand, if I specify a very small range (0. to 0.001), then the returned upper limit starts to go down, apparently because I am truncating the distribution.
I therefore find that the upper limits calculated by FeldmanCousins can easily go wrong. To make matters worse, if I do specify too small an upper end for the xsec range, FeldmanCousins does not issue any warning.
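For reproducibility, this is essentially the calculation I run, with only the range of xsec changed between trials (a sketch; `data` and `modelConfig` are as above, and SetNBins is my attempt to control how many points are scanned across the allowed range):

```python
xsec.setRange(0., 100.)        # "reasonable" range: sensible upper limit
# xsec.setRange(0., 100000.)   # very wide range: upper limit returned with poor precision
# xsec.setRange(0., 0.001)     # very narrow range: upper limit silently shrinks (truncation?)

fc = r.RooStats.FeldmanCousins(data, modelConfig)
fc.SetConfidenceLevel(0.95)
fc.SetNBins(100)               # scan points are spread across whatever range xsec currently has
interval = fc.GetInterval()
print("95% CL upper limit on xsec:", interval.UpperLimit(xsec))
```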
- I want the normalization of the background PDF to float freely, so that it is constrained by the data without any a priori assumptions. I do this by declaring the normalization of the background PDF to be a “nuisance parameter” and assigning it extremely large a priori uncertainties. Are the IntervalCalculators provided in RooStats all robust in this case, where a nuisance parameter has very large a priori uncertainties?
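For reference, this is how I set up the ModelConfig (a sketch using the objects from the first snippet; in my real script the signal yield nsig is re-expressed as xsec times luminosity times efficiency, so that xsec is the parameter of interest):

```python
w = r.RooWorkspace("w")
modelConfig = r.RooStats.ModelConfig("modelConfig", w)
modelConfig.SetPdf(model)
modelConfig.SetObservables(r.RooArgSet(mass))
modelConfig.SetParametersOfInterest(r.RooArgSet(xsec))
# The background normalization nbkg floats over a very wide range and is declared
# a nuisance parameter, so it is constrained only by the data themselves
modelConfig.SetNuisanceParameters(r.RooArgSet(nbkg))
```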