Unknown normalisation factor of TF1 when fitting a histogram normalised to 1

Hello,

I’m having trouble fitting a TF1 function that is normalised to 1 to a TH1F histogram that is also normalised to 1.

My function is normalised by construction: the normalisation constant itself depends on the parameters of the function, so the integral of the normalised function from x_min to x_max is 1 independently of the parameter values.

It seems that when I use this function to fit a normalised histogram, the TF1 rescales the function by some hidden normalisation constant that appears to depend on the number of bins of the histogram I’m fitting and on x_max - x_min. Is that possible? And if so, how do I find out what this normalisation constant is and how it is computed?
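
For reference, here is a minimal PyROOT sketch of the kind of check I mean (the shape, binning and statistics below are placeholders, not my real analysis): since the fit compares the function value at each bin centre with the bin content, a histogram whose bin contents sum to 1 differs from a unit-area density by roughly one factor of the bin width, (x_max - x_min)/nbins.

import ROOT

# placeholder setup, only to illustrate the comparison
x_min, x_max, nbins = 0.0, 1.0, 50
bin_width = (x_max - x_min) / nbins

gen = ROOT.TF1("gen", "x*x", x_min, x_max)        # arbitrary shape to fill from
h = ROOT.TH1F("h", "toy data", nbins, x_min, x_max)
h.FillRandom("gen", 100000)
h.Scale(1.0 / h.Integral())                       # "normalised to 1": bin contents now sum to 1

# the same shape normalised so that its integral over [x_min, x_max] is 1
f = ROOT.TF1("f", "3*x*x", x_min, x_max)

# the bin content and the density value differ by about one bin width
i = h.FindBin(0.7)
print("bin content             :", h.GetBinContent(i))
print("f(centre) * bin width   :", f.Eval(h.GetBinCenter(i)) * bin_width)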

Thanks,
Davide

EDIT

Digging into the TF1 source code, I read about the fNormalized boolean member and SetNormalized(True). After calling that in my Python fitting script, I ran IsEvalNormalized() to check whether my function is evaluated as a normalised function. Why do I get

TF1::IntegralOneDim:0: RuntimeWarning: Error found in integrating function f1g in [0.000000,1.000000] using AdaptiveSingular. Result = nan +/- nan - status = 11

when the TF1 is a simple (Norm([0],[1]))*[0]*x^2 + [1] function?
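
In case it is relevant, this is the kind of minimal sequence I mean (a plain quadratic standing in for my real function, just to show the calls):

import ROOT

# plain quadratic as a stand-in; [0] and [1] are the fit parameters,
# the range [0, 1] matches the warning above
f1g = ROOT.TF1("f1g", "[0]*x*x + [1]", 0.0, 1.0)

f1g.SetNormalized(True)        # ask TF1 to divide by its own integral over the range
print(f1g.IsEvalNormalized())  # -> True

# the parameters are still at their default value of zero at this point
print(f1g.GetParameter(0), f1g.GetParameter(1))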

Hi,
Yes, you can use the normalisation flag, but you need to add an extra parameter if you are also fitting the normalisation (e.g. doing an extended fit).
The result is probably nan because the parameters might be at a wrong value (e.g. all zero). I would check this.
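
For example, something like this (the starting values are arbitrary, just non-zero) before calling Fit:

import ROOT

f1g = ROOT.TF1("f1g", "[0]*x*x + [1]", 0.0, 1.0)
f1g.SetNormalized(True)

# give the parameters sensible non-zero starting values before fitting,
# otherwise the function (and its normalising integral) is identically zero
f1g.SetParameters(1.0, 0.5)
print(f1g.GetParameter(0), f1g.GetParameter(1))

# then e.g.: your_histogram.Fit(f1g, "R")   # "R" = restrict the fit to the function range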

Lorenzo