[quote]
Yes, the derivatives for the fixed parameters are not necessary; they could probably be skipped. I will see if I can easily implement it so that they can be skipped.
However, as you mention, ParameterGradient is implemented by default using DoParameterDerivative. You could simply provide a dummy implementation (for example, returning zero) for the fixed parameters.
[/quote]
Well, I know that this is not skipped, because I need to debug the gradient calculations deeply. But anyone who is not aware that gradients are requested even for fixed parameters will wonder why their fit takes so long.
So skipping the unneeded gradient requests would be nice.
Another thing is that DoParameterDerivative is not called by default. By default ParameterGradient() is called, and only if it is not available is DoParameterDerivative called. You can see this in the attached sample: “ParameterGradient” is printed for every request, while “DoParameterDerivative” is not printed at all. So with this implementation one does not know which parameter's gradient was actually requested.
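To make the point concrete, here is a minimal sketch of the dispatch as I understand it (based on ROOT's Math/IParamFunction.h; the signatures are written from memory and may need adjusting). The fitter requests the full gradient through ParameterGradient(), and only the base-class default implementation forwards to DoParameterDerivative() parameter by parameter, which is where one could return a dummy value for the fixed parameters:
[code]
#include "Math/IParamFunction.h"
#include <cstdio>

// Hypothetical two-parameter linear model, for illustration only.
class MyGradFunc : public ROOT::Math::IParametricGradFunctionMultiDim {
public:
   unsigned int NDim() const { return 1; }
   unsigned int NPar() const { return 2; }
   const double *Parameters() const { return fPars; }
   void SetParameters(const double *p) { fPars[0] = p[0]; fPars[1] = p[1]; }
   ROOT::Math::IParametricGradFunctionMultiDim *Clone() const { return new MyGradFunc(*this); }

   // If this override is present, the fitter calls it directly and
   // DoParameterDerivative() below is never reached:
   //
   // void ParameterGradient(const double *x, const double *p, double *grad) const { ... }

private:
   double DoEvalPar(const double *x, const double *p) const {
      return p[0] + p[1] * x[0];
   }
   // Reached only through the base-class default ParameterGradient() loop;
   // here one could return a dummy 0 for fixed parameters to skip the work.
   double DoParameterDerivative(const double *x, const double *p, unsigned int ipar) const {
      std::printf("DoParameterDerivative for parameter %u\n", ipar);
      return (ipar == 0) ? 1.0 : x[0];
   }

   double fPars[2] = {0.0, 0.0};
};
[/code]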
[quote]
Yes, I know that initial values may be 0 if I do not provide any.
[/quote]
This is not the case. See the attached sample: the parameters always start at 0 (I think it was different with Fumili), even if the parameter limits exclude 0. And in this sample you can also see the Minuit warning about Chi2 point rejection, caused by these zero-valued parameters.
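For reference, this is how I understand one is supposed to provide initial values through the ROOT::Fit::Fitter configuration (a sketch with method names from memory, assuming the fit function has already been set on the fitter):
[code]
#include "Fit/Fitter.h"

void setStartValues(ROOT::Fit::Fitter &fitter)
{
   // Non-zero start value with limits that exclude 0.
   fitter.Config().ParSettings(0).SetValue(1.5);
   fitter.Config().ParSettings(0).SetLimits(1.0, 2.0);
   // Fix the second parameter at a non-zero value.
   fitter.Config().ParSettings(1).SetValue(0.7);
   fitter.Config().ParSettings(1).Fix();
}
[/code]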
By the way, what is “a log produced by Minuit”? How do I obtain it?
[quote]
The parameter change in Minuit is calculated using the gradient and the Hessian. You get a small change when the gradient goes towards zero, which normally happens when you are close to the minimum.
If you have a region with zero gradient which is not the minimum, it is true that this could confuse Minuit. You should then set the parameter values outside of this region.
[/quote]
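To spell out the formula I assume is meant here (a standard quasi-Newton step; this is my reading, not a statement about Minuit internals):

$$\Delta p = -\,H^{-1}\,\nabla F(p)$$

so the step $\Delta p$ shrinks as the gradient $\nabla F(p)$ goes to zero.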
No, the gradient is 0 only at the minimum. However, I am not sure I understand. Let’s assume the fitter is close to the minimum. It changes a parameter at the 4th decimal place, and the function value changes, getting closer to the minimum. Then it changes the parameter at the 5th decimal place, and the function value changes a little. But then it changes the parameter at the 6th decimal place, and such a small parameter change does not change the function at all. At that point it displays a message that the parameter change does not seem to affect the function. This is true, but it is exactly what is expected… Or do you mean that this situation happens when the gradient changes with the parameter while the function value does not?
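To illustrate the last step with a self-contained toy (nothing to do with the attached sample, just double precision at work), a parameter change far down the decimal places can leave the computed function value bit-for-bit unchanged:
[code]
#include <cstdio>

// Toy chi2-like function with a minimum at p = 1; the large offset
// mimics a function value of order 1e8, where one ULP is about 1.5e-8.
double f(double p) { return (p - 1.0) * (p - 1.0) + 1e8; }

int main()
{
   double p  = 1.0001; // close to the minimum
   double dp = 1e-9;   // change at the 9th decimal place
   std::printf("f(p)       = %.17g\n", f(p));
   std::printf("f(p + dp)  = %.17g\n", f(p + dp));
   std::printf("difference = %.17g\n", f(p + dp) - f(p)); // prints 0
   return 0;
}
[/code]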
Another question: when does the fitting procedure compare the analytical gradient with the numerical one? In the attached sample the comparison is sometimes displayed and sometimes not.
exampleGradFit.C (3.63 KB)