I would like to know whether the following approach is valid: run, e.g., 100 jobs with 1K toys each, save the result returned by
HypoTestInverter::GetInterval() from each job into a ROOT file, and then merge these ROOT files together. Would this be equivalent to running one job with 100K toys?
I have tested this on a simplified model, comparing 100 jobs with 1K toys each against a single job with 100K toys, and the two give the same answer. Is this an acceptable way to run these limits in order to speed up the jobs?
For additional information, I am using the
FrequentistCalculator with the
ProfileLikelihoodTestStat on a simple Poisson model with one Gaussian constraint.
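For reference, this is roughly how I do the merging, using HypoTestInverterResult::Add. A sketch, assuming each job wrote its result under the (hypothetical) key "result" to files named result_0.root ... result_99.root; it would need to run inside a ROOT session with RooStats available:

```cpp
#include <cstdio>
#include "TFile.h"
#include "RooStats/HypoTestInverterResult.h"

void mergeResults() {
    RooStats::HypoTestInverterResult* merged = nullptr;
    for (int i = 0; i < 100; ++i) {
        TFile f(Form("result_%d.root", i));           // per-job output file (hypothetical name)
        auto* r = f.Get<RooStats::HypoTestInverterResult>("result");
        if (!r) continue;                              // skip missing/failed jobs
        if (!merged)
            merged = (RooStats::HypoTestInverterResult*) r->Clone("merged");
        else
            merged->Add(*r);                           // pools the toys at each scan point
    }
    if (merged)
        std::printf("merged upper limit: %g\n", merged->UpperLimit());
}
```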