ROOT Minuit Fitting Discrepancy: Mac versus Linux installation

Hello,

Let me preface this by saying that I am quite green at ROOT; I had very little experience until about 3 weeks ago, so I apologize :slight_smile:

I have code that I am running to extract two fit parameters from a .root data file. The code works perfectly fine on any macOS ROOT installation. I’ve tried versions 6.19/01, 6.16/00, 6.14/06, and 6.14/00 on three different MacBooks running Mojave, and everything runs fine and produces numbers that we agree with.

Because we are preparing to ramp up some simulations and analysis, we’ve set up an Ubuntu 18.04 LTS server and have built and installed ROOT there. Here is where the crux of the problem lies: we run the macro on this ROOT installation - same exact code, same exact data - and now the fitting portion spits out this error.

FUNCTION VALUE DOES NOT SEEM TO DEPEND ON ANY OF THE 2 VARIABLE PARAMETERS.
          VERIFY THAT STEP SIZES ARE BIG ENOUGH AND CHECK FCN LOGIC.
 *******************************************************************************
 *******************************************************************************
  FUNCTION VALUE DOES NOT SEEM TO DEPEND ON ANY OF THE 2 VARIABLE PARAMETERS.
          VERIFY THAT STEP SIZES ARE BIG ENOUGH AND CHECK FCN LOGIC.
 *******************************************************************************
 *******************************************************************************
****************************************
Minimizer is Minuit / MigradImproved
Chi2                      =       73.565
NDf                       =           46
Edm                       =  1.07836e-11
NCalls                    =          196
#sigma_{mp}               =      671.619   +/-   1.37259      	 (limited)
T                         =      293.068   +/-   3.94207      	 (limited)

Covariance Matrix:

            	 #sigma_{mp}           T
#sigma_{mp} 	       1.884     0.57166
T           	     0.57166      1.1072

Correlation Matrix:

            	 #sigma_{mp}           T
#sigma_{mp} 	           1     0.39581
T           	     0.39581           1

The values we get are in the general realm of what we want, but not what we expect. I’ve installed the same versions of ROOT (all separately sourced in local directories and terminal instances, mind) and I get the same error and the same odd fit parameters. Looking into this error, it has been suggested that Minuit is for some reason finding a region where the gradient is zero, which confuses it into thinking there is a minimum there, and that I should try limiting the parameter range fed to Minuit to exclude that problem region. When I do, the error goes away, but the values are even further off. I also don’t find that reasoning entirely convincing, considering the same code works completely fine on one ROOT installation (Mac) and not the other (Ubuntu), so it seems something is not defined correctly, or a different library is being picked up where another one would be better.
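For concreteness, this is roughly the shape of the fit and of the parameter limiting I tried - the file name, histogram name, model formula, ranges, and starting values below are placeholders, not my actual analysis:

```cpp
#include "TFile.h"
#include "TH1D.h"
#include "TF1.h"

void limitedFit()
{
   // Open the data file and grab the histogram to be fitted
   // ("data.root" and "spectrum" stand in for my real names).
   TFile *f = TFile::Open("data.root");
   TH1D  *h = (TH1D*)f->Get("spectrum");

   // Stand-in model with two free parameters; my real function differs.
   TF1 fitFunc("fitFunc", "[0]*exp(-x/[1])", 0., 1000.);
   fitFunc.SetParNames("#sigma_{mp}", "T");
   fitFunc.SetParameters(650., 300.);     // rough starting values
   fitFunc.SetParLimits(0, 0., 2000.);    // bounds chosen to exclude the flat region
   fitFunc.SetParLimits(1, 0., 1000.);

   // "R" restricts the fit to the function range; "M" selects the IMPROVE step,
   // which is why the printout above reports "Minuit / MigradImproved".
   h->Fit(&fitFunc, "RM");
}
```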

IGNORE THIS ~~Looking in some of the ROOT documentation, I came across THIS, where it suggests that there are two versions of the TFormula class and they are not necessarily compatible. I’m wondering if the Mac ROOT is using the Cling-based one and Ubuntu is using the CINT-based one, but I am such a novice at ROOT that I really cannot say much more and am kind of at a loss as to where to proceed.~~

It looks as if CINT has been deprecated for quite a while, so this explanation makes no sense, which means I am even more at a loss.

Any help or pointers would be greatly appreciated!

I think @moneta can help you.

Hi,
Sorry for the late reply. We need a running, reproducible program to understand the issue. The difference seems to be large, so I suspect there is a problem somewhere in your code.
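For example, a self-contained macro along the lines of this skeleton (the formula, binning, and names are only placeholders - please put in your real model and the same data you fit on both machines) would be enough for us to run directly:

```cpp
#include "TH1D.h"
#include "TF1.h"

void reproducer()
{
   // Histogram with the data to be fitted; fill it with the actual values.
   TH1D h("h", "data to fit", 48, 0., 1000.);
   // ... e.g. h.SetBinContent(i, value); h.SetBinError(i, error); ...

   TF1 fitFunc("fitFunc", "[0]*exp(-x/[1])", 0., 1000.);   // replace with your model
   fitFunc.SetParNames("#sigma_{mp}", "T");
   fitFunc.SetParameters(650., 300.);
   h.Fit(&fitFunc, "RM");
}
```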

Lorenzo