I want to use the FeldmanCousins class to calculate limits on an efficiency. However, I have some questions related to the implementation of the NeymanConstruction:
It seems that the FeldmanCousins class (which uses the NeymanConstruction class behind the scenes) uses MC toys to calculate the pdf of the test statistic. I wonder why this brute force is needed when the calculation could be done analytically; I only have a (simple) pdf. Maybe it is done to allow some kind of generalisation, but I do not quite see the reason for it, and it requires quite some CPU resources, even for a very simple pdf. Moreover, when you need the confidence interval close to a physical boundary (e.g. zero events for a Poisson), you have to generate a huge number of toys, since the Feldman-Cousins ordering needs to include very low probabilities to get the correct answer (it does not scale with the size of the test). At that point you might just as well use central confidence intervals when generating toys.
Slightly related to that, I see that the accuracy of the POI in the NeymanConstruction is determined by the number of points to test. This could be improved by implementing an adaptive search for the confidence limits, which would speed up the calculation since you would not need to scan every point of the POI grid.
Thank you in advance,
P.S. I'm also struggling to define a binomial distribution with the RooWorkspace factory. This line:
wspace->factory("Binomial::bion(n[0,10],k[0,10])") does not work. Any ideas are welcome.