I am trying to perform an extended unbinned maximum likelihood fit to a weighted MC dataset, but the resulting fit parameter uncertainties are much larger than they should be. When no limits are placed on the parameter of interest, the uncertainty can be several times the value of the parameter itself. The command used is:
When I perform the same fit to the unweighted data (with SumW2Error set to False), the resulting uncertainties are substantially smaller, despite a negligible change in the shape of the distribution.
A very similar issue has been reported here:
There is a brief suggestion of using the event weight distribution itself, but it’s not very clear to me how this would be done.
I understand that an approximation is used to estimate the uncertainties for ML fits to weighted data, but is there a regime in which it is not valid? Is it still not possible to use MINOS for these kinds of weighted fits?
This is very strange, because if your weights are all significantly less than one, the number of effective entries will be larger than the simple sum of the weights. Therefore the correction applied when you use RooFit.SumW2Error(True) should make the error much smaller than when not applying it.
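The arithmetic behind this point can be sketched in a few lines of plain Python (the weights are illustrative): for weights below one, the effective number of entries n_eff = (Σw)² / Σw² exceeds Σw, and the SumW2-corrected yield uncertainty √(Σw²) is smaller than the naive √(Σw).

```python
# Illustrative weights, all well below 1.
weights = [0.3] * 1000

sum_w = sum(weights)                  # yield seen by the fit: ~300
sum_w2 = sum(w * w for w in weights)  # ~90
n_eff = sum_w**2 / sum_w2             # effective entries: ~1000

# Effective entries exceed the plain sum of weights when w < 1:
print(n_eff, sum_w)

# Yield uncertainty: naive sqrt(sum w) vs SumW2-corrected sqrt(sum w^2).
naive = sum_w ** 0.5       # ~17.3
corrected = sum_w2 ** 0.5  # ~9.5, i.e. the corrected error is smaller
print(naive, corrected)
```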
Are you sure the correction is applied?
If this is the case, I would need a macro and the RooWorkspace showing the problem.
Many thanks for the reply. I think I've tracked down the issue. Indeed, you are right: the correction was not actually being applied. Here's the message in the fit output, which I somehow missed, that says as much:
It turns out that I am keeping several of the parameters in my fit constant via some lines like this:
If I instead explicitly set the variables constant with:
then the fitting procedure doesn’t take these parameters into account when determining the correlation matrix and I don’t get the error mentioned above. Now the SumW2Error correction is being applied properly and the uncertainties on the remaining free parameters look great. Sorry this came down to such a silly issue on my end. Thanks again!