Linear regression fit

Fitting the classic y = a + b*x to a series of points (x_i, y_i) seems quite straightforward.
How can the fit be tuned to take into account associated errors that differ for each x_i and y_i?
Example: given four columns

x_0 y_0 sigma_x_0 sigma_y_0
x_1 y_1 sigma_x_1 sigma_y_1
…
x_n y_n sigma_x_n sigma_y_n

how do I find the best fit (with a Pearson X**2 minimization … ?)

Thanks for the help


To my knowledge, this does not answer the question: those classes are for making graphs and showing errors relative to the data points, but they do not provide any fitting method. It's just about visualization.


When you look at a class's documentation you should also look at the inherited methods. As @Wile_E_Coyote pointed out, the Fit method is implemented in TGraph. A TGraphErrors is a TGraph, so it has the TGraph methods as well. Open "Public member functions inherited from TGraph" on the TGraphErrors reference page.
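Putting this together, a minimal macro sketch (assuming a working ROOT installation; the data values here are made up) would read the four-column layout from the question into a TGraphErrors and call the inherited Fit:

```cpp
// ROOT macro sketch: run with `root -l fitexample.C`.
void fitexample() {
   const int n = 4;
   double x[n]  = {1., 2., 3., 4.};
   double y[n]  = {1.2, 2.9, 5.1, 6.8};   // made-up measurements
   double ex[n] = {0.1, 0.1, 0.2, 0.1};   // sigma_x_i
   double ey[n] = {0.2, 0.3, 0.2, 0.4};   // sigma_y_i

   TGraphErrors *gr = new TGraphErrors(n, x, y, ex, ey);
   gr->Fit("pol1");    // chi2 fit of y = p0 + p1*x; Fit is inherited from TGraph
   gr->Draw("AP");     // draw points with error bars and the fitted line
}
```

`Fit("pol1")` prints the fitted parameters p0 (intercept) and p1 (slope) with their uncertainties and the chi2 of the fit.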

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.