Defining when a fit fails

ROOT Version: 6.22/08
Platform: Ubuntu 20.04.2 LTS
Compiler: g++

I am fitting histograms with the log-likelihood fitting method. I am generating data sets that should all represent the same physical phenomenon (radioactive decay in my case); the only thing that changes between the data sets is the statistical fluctuation of the random number generator I use. Despite this, some data sets fail to minimize while others converge. Is it possible to get more information on why, specifically, the minimization is failing? Additionally, is it possible to set my own condition for when a fit fails?

By the way, I have already set the maximum number of iterations and function calls the fit is allowed to perform.
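One way to get more diagnostics out of a fit is to request the fit result object with the `"S"` option and raise the minimizer's print level so every Minuit step is shown. A minimal sketch, assuming a histogram `h` and fit function `f` already exist (the function name `diagnoseFit` is illustrative):

```cpp
// ROOT macro sketch -- assumes a filled histogram h and a TF1 f.
#include "TH1D.h"
#include "TF1.h"
#include "TFitResult.h"
#include "Math/MinimizerOptions.h"

void diagnoseFit(TH1D* h, TF1* f)
{
   // Print every minimization step, not just the final summary
   ROOT::Math::MinimizerOptions::SetDefaultPrintLevel(2);

   // "L" = log-likelihood fit, "S" = return a TFitResultPtr
   TFitResultPtr r = h->Fit(f, "LS");

   // Status() == 0 means the minimizer converged; IsValid()
   // additionally requires an accurate covariance matrix
   if (!r->IsValid()) {
      printf("Fit failed: status = %d, covariance status = %d\n",
             r->Status(), r->CovMatrixStatus());
   }
}
```

The returned `TFitResultPtr` also lets you impose your own acceptance condition after the fit, e.g. rejecting results whose covariance matrix was only approximated.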


I think the most likely cause of the fit failures is a bad initial starting point, which prevents the fit from finding the minimum. Another possible cause could be numerical precision. Unfortunately there are no conditions like the maximum number of iterations that can be set in this case. My suggestion is to try to understand these failures and, if possible, find a way to cure them, e.g. by setting better initial parameter values or by making the minimization more stable with a numerically more robust function implementation, avoiding TH1F in favour of the double-precision TH1D, etc.
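The "better initial values" suggestion can be automated by retrying a failed fit from a few different starting points. A sketch for an exponential-decay model, where the candidate decay constants and the function name `fitWithRetries` are purely illustrative:

```cpp
// Sketch: retry a log-likelihood fit from several starting points
// and accept the first one that converges. Assumes f has two
// parameters: [0] = amplitude, [1] = decay constant.
#include "TH1D.h"
#include "TF1.h"
#include "TFitResult.h"

bool fitWithRetries(TH1D* h, TF1* f)
{
   // Candidate starting values for the decay constant (illustrative)
   const double tau0[] = {1.0, 5.0, 20.0};
   for (double tau : tau0) {
      f->SetParameter(0, h->GetMaximum());  // amplitude guess from the data
      f->SetParameter(1, tau);
      TFitResultPtr r = h->Fit(f, "LSQ");   // L: likelihood, S: result, Q: quiet
      if (r->IsValid())
         return true;                       // converged with a valid error matrix
   }
   return false;  // all starting points failed; inspect this data set by hand
}
```

Seeding the amplitude from the data itself (here via `GetMaximum()`) usually helps more than any fixed constant, since the statistical fluctuations between your generated data sets change the scale slightly each time.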

Best regards