Fitting the classic y = a + b*x given a series of points (x_i, y_i) seems quite straightforward.
How can the fit be tuned to take into account the associated errors, which differ for each x_i and y_i?
Example: given 4 columns

To my knowledge, this does not answer the question: those classes are for making graphs and showing the errors associated with the data points, but they do not provide any fitting method. It's just about visualization.

When you look at a class's documentation you should also look at the inherited methods. As @Wile_E_Coyote pointed out, the Fit method is in TGraph. A TGraphErrors is a TGraph, so it has the TGraph methods as well. Open "Public member functions inherited from TGraph" on this page.