Least square with orthogonal polynomials

Dear ROOT Experts,

I’m fitting a histogram with a polynomial.

To make the fit more stable and to obtain uncorrelated estimates of the parameters, I would like to use a basis of orthogonal polynomials.

I would like to ask whether ROOT contains an implementation of an algorithm that calculates the set of orthogonal polynomials corresponding to a given “set of measurements” (the X_i of the histogram).

I browsed the class documentation but could not find it;
can somebody help, please?

Many Thanks


I am not sure I have fully understood what you would like to do. Do you want to do a least-squares fit to the data with an orthogonal set of polynomials, or perform a smoothing (a non-parametric regression)?

In the first case, we have in RooFit two classes, RooChebychev and RooBernstein, which contain implementations of the Chebyshev and Bernstein polynomials and can then be used for fitting.

Best Regards


see also the class TMultiDimFit and the tutorial $ROOTSYS/tutorials/fit/multidimfit.C


Dear Lorenzo and Rene,

thank you very much for your replies.

As far as I understand the problem, fitting a histogram with a set of “orthogonal” polynomials such as the Chebyshev or Legendre ones does not cure the fact that the covariance matrix of the fit is not diagonal. The reason is that these polynomials are orthogonal when integrated over a range with respect to their weight functions, but they are not “locally” orthogonal when evaluated at the x_i points of the histogram.

I read that the theoretically recommended way to extract the fit parameters in this case is to find the set of polynomials that are orthogonal with respect to the x_i. This allows the normal equations of the linear least-squares problem to be solved easily and gives a diagonal covariance matrix. Maybe this is called non-parametric regression?

I read the multidimfit.C tutorial, but I could not find what I mean, that is, a fit with a linear combination of orthogonal polynomials, not with a linear combination of functions, each of them expressed as a combination of polynomials. Maybe I missed a possible application of the TMultiDimFit package?

Many Thanks



can you please give a reference (paper or book) for what you mean by “least squares with orthogonal polynomials”?

TMultiDimFit performs a regression analysis to find the optimal parametrization of the dependent variable Y in terms of the independent variables X, using an optimal linear combination of polynomials. What the class does is well explained in the class reference documentation: root.cern.ch/root/htmldoc/TMultiDimFit.html

Cheers, Lorenzo

Dear Lorenzo,

here are some references:

  • R.J. Barlow, “Statistics”, Wiley, 1999: Section 6.6 “Linear least squares and matrices”, Subsection 6.6.2 “Higher polynomials” (in my edition, pages 114-115)

  • W.T. Eadie, D. Drijard, F. James, M. Roos, and B. Sadoulet, “Statistical Methods in Experimental Physics”, North-Holland, 1971: Section 8.4 “The least square method”, Subsection 8.4.2 “The polynomial model” (in my edition, pages 165-166)

  • Derek J. Hudson, “Statistics Lecture II”, CERN Report 64-18, 1964: Section “Orthogonal Polynomials”, pages 181-186

Perhaps this is the same as what TMultiDimFit does, even if it seems to me that the Gram-Schmidt orthogonalisation is used only to reduce the number of “orders” once the fit is performed, not to find the functions used to perform the fit…

Many thanks for your time

Hi Paola,

Thank you for the references. I have understood now what you are looking for. We don’t have such an algorithm for finding orthogonal polynomials available in ROOT, although something similar is used inside TMultiDimFit to reduce the number of functions in the final expression.
However, I think using an algorithm like Minuit for the linear fit should work fine, even when the parameters are correlated and you might have problems inverting the matrix directly, as is done in TLinearFitter.

Best Regards