I apologise in advance for my naive questions, but I am having some trouble understanding how to use SetErrorDef() correctly.
Basically the problem is the following: I am doing a (log-)likelihood minimisation with two free parameters using TMinuit, calling MIGRAD. Then I call SetErrorDef(5.99), since I want the 2-sigma contour, but I noticed that changing from SetErrorDef(1) to SetErrorDef(4) also changes the minimum values found for the two parameters. I still do not understand why this happens.
So I ended up with these questions:
- In a 2D space, is it correct to use SetErrorDef(5.99) instead of SetErrorDef(4) for a likelihood minimisation?
- Why do I get different minima if I change from SetErrorDef(1) to SetErrorDef(4), which, as far as I understand, should only change the contours?
- Moreover, I happened to call SetErrorDef() before actually calling MIGRAD, and I noticed that if I call them in the opposite order, the minima found change yet again. I do not understand why this happens either. How does this affect the results, and what is the correct order to call them in?
I am using ROOT 5.34 (I do not know if this makes any difference though).
Thanks a lot in advance for your help!