It seems that for a large data set with very small errors on each point, the fit returns a high chi-square value (FCN) even if the predefined function is reasonable. But it is not always possible to have large error bars that would reduce the chi-square value.
(1) Is there any way out in that case?
(2) Is it possible to include x errors while fitting a 3D dataset (a TGraph or TGraphErrors)?
(3) Is it possible to take the projections of my 3D fit on the YZ and XZ planes?
I think what will help here is the difference between chi^2 and the chi^2 probability. Chi^2 is just the sum of the squared differences between the function and the data points, each weighted by its error. With more data points, chi^2 generally grows. The chi^2 probability, however, tells you how likely it is that you have a good or a bad fit, and it takes the number of data points into account. See e.g. here:
The number of degrees of freedom is the number of data points minus the number of fit parameters. Probabilities very close to zero indicate bad fits.
Now some comments on your questions. Some of these points may be unnecessary if you use chi^2 probabilities, though.
Not necessarily. If the model describes the data more accurately, chi^2 is lower. But it's true that if the model is "good enough", you generally get a higher chi^2 with more data points. That's not a problem, though, as I explained above.
Yes, the chi^2 probability.
I believe there's no plain TGraph in 3D, but ROOT does have TGraph2D and TGraph2DErrors for z = f(x, y) data, and histograms such as TH3D can also be fit. A histogram fit doesn't take x errors into account, though; it takes into account how many entries per bin you have.