I am using MINUIT through ROOT::Fit::Fitter to minimize a chi2 I defined myself (based on the fitCircle.C example), but I am having trouble estimating the real value of my parameter errors. From several tests, the "correctness" of my errors depends strongly on the precision value I set with ROOT::Math::MinimizerOptions::SetPrecision(). I have looked in many places for the exact definition of "precision", but there is little documentation about it, and what I found is in general not very clear about this particular parameter.
Could you kindly explain how the precision is used by MINUIT?
You should use the default value of SetPrecision, which is the numerical precision of double arithmetic (~10^-16).
Using other values could make the minimization fail to converge.
The precision defines the numerical accuracy with which the function to be minimized (your chi2 function) is computed.
I have indeed tried that, but my error is badly estimated. The error should be ~1.9 x 10^-3, but a precision of 10^-16 gives an error of 7.3 x 10^-6 (using either MIGRAD or MINOS). The problem (I think) is that my chi2 has rather large fluctuations, so a small numerical precision seems to get MINUIT stuck in one of these fluctuations while looking for Delta(chi2) = 1 (I have also tried larger values of UP, but the value of my error still depends on the precision).
Could you maybe provide a small numerical example to see explicitly how precision is used by MINUIT?
Is it something of the sort: smallest possible step = initialStep x precision?
In case it is helpful, I am attaching a chi^2 scan I obtained manually. The parabola was fitted simply with TGraph::Fit(), and it shows what I should find with my minimizer setup.
Looking at the plot I see very large fluctuations in your chi2, at the level of Delta(chi2) = 1 or even larger. I don't think these are real local minima of the chi2, so it is probably a numerical problem.
In that case Minuit cannot really help and you cannot use the Delta(Chi2)=1 rule.
The value you get from fitting a parabola will also not be reliable.
Also, there is very little experience with changing the precision in Minuit. I am not sure everything will work correctly if you change it from 10^-16 to 10^-8.
So probably the only way for you to estimate the uncertainty is to run simulations, repeating the experiment and looking at the variation of the estimated parameters.
Indeed these fluctuations are not local minima, but due to the nature of my study I cannot avoid them. In principle, what the MINUIT minimization should find should be similar to what I get from the parabola fit (in this case I can do this check because there is only one free parameter, but eventually I will move to many variables).
I have tried modifying the error definition to look for e.g. Delta(chi2) = 100, and scaling the chi2 by some factor (e.g. chi2/100) and then re-scaling the errors obtained. With either of these methods the results are fairly consistent for a precision of 10^-2, but they quickly fail for other values. Leaving the precision at the default makes the fit fail (huge EDM, ~10^8), and explicitly setting precision = 10^-16 completely misestimates the error (~2 x 10^-5).
Do you happen to know of any other way to make the minimization less sensitive to these fluctuations, and somehow "smooth" the chi2 that MINUIT sees?
The precision is crucial in Minuit for computing the gradient and finding the optimal step size. A correct gradient gives a correct minimisation.
So another possibility could be to compute the gradient in a different way (e.g. by automatic differentiation or analytically) and provide it to Minuit.
Thank you for the suggestion. I have compared the derivative values estimated by Minuit with the analytical ones (obtained from the fitted parabola), and they are reasonably similar. In fact, I generally have no issues finding the minimum with Minuit: my minimizations converge with a reasonable EDM for different values of the precision. The problem is that the estimated error does not correspond to what can be estimated from the plot I attached before, even when setting precision = -1 (the default according to the manual).
I have also tried changing the error definition with SetErrorDef(100) so that it looks for Delta(chi2) = 100, and scaling the error back to 1 sigma as error(def=1) = error(def=100)/sqrt(100), but the result does not correspond at all to the chi2 plot (Minuit gives ~10^-6).
Is an additional command needed to force Minuit to look for Delta(chi2) = 100?