I suppose this is a more theoretical question about the numerical inversion of nearly singular symmetric matrices, but it does use ROOT and may be useful to others.
I have a symmetric matrix that is an instance of the TMatrixDSym class, and I am inverting it using the TDecompChol class to do the decomposition pre-inversion.
My matrix is a covariance matrix and currently the off-diagonal elements are of the same order of magnitude as the diagonal elements. The eigenvalues retrieved using the TMatrixDSymEigen class are small (on the order of 10^-6 or 10^-8), but they are above the tolerance factor so the inversion succeeds. The determinant is calculated to be 6.12371e-65 (matrix nearly singular).
The inverse covariance matrix has some elements with very large values (both positive and negative, on the order of 10^5 or 10^6), even though the original matrix has elements on the order of 10^-4.
When I try to compute a chi-squared value using the generalized formula:
chi^2 = (data - theory)^T * Cov^-1 * (data - theory)
the chi^2 values come out too high by 4 or 5 orders of magnitude (the details of this are much deeper than need to be described here). When I alter the symmetry of the covariance matrix, the inverse and the chi^2 values seem well-behaved.
This seems like a theoretical issue: are things not mathematically well-behaved for symmetric matrices with very small determinants, or for symmetric matrices whose off-diagonal elements are of the same order of magnitude as the diagonal elements? Could this be a breakdown of one of the assumptions behind the chi^2 formula? Or is this an issue with the inversion algorithm?
Any help or advice is greatly appreciated.
(I’m using ROOT 6.08/02 on macOS 10.12.6.)