Hello,
I’m working on subrun-level fits of time histograms from an experiment, using the Weighted Log-Likelihood Estimation (WLLE) method with:
hist->Fit("myfunc", "R0WLS");
The data are weighted histograms, and the fit function includes a decaying cosine model. I’ve been running these fits successfully for years on ROOT 6.26.10, even when some bins had zero content or zero error (especially at large times, e.g., >220 µs, where the data are sparse).
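For context, the model has this general shape; the sketch below is illustrative only (the parameter names, starting values, and fit range are placeholders, not my production code):

TF1 *myfunc = new TF1("myfunc",
                      "[0]*exp(-x/[1])*(1 + [2]*cos([3]*x + [4]))",  // decaying cosine
                      30., 300.);                    // fit range in µs (placeholder)
myfunc->SetParNames("N", "tau", "A", "omega", "phi");
myfunc->SetParameters(1e5, 60., 0.5, 1.5, 0.);       // placeholder starting values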
However, after updating to ROOT 6.30.x, I suddenly encounter a segmentation fault when running the exact same code and input files. After investigation, I found that the crash occurs in the Poisson likelihood fit when any bin error is zero. I confirmed this by printing warnings for bins with zero error, and replacing the fit option "WL" with "L" avoids the crash.
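The check I used is essentially the following (reconstructed from memory; hist stands for one of my subrun histograms):

for (int i = 1; i <= hist->GetNbinsX(); ++i) {
   if (hist->GetBinError(i) == 0.)
      printf("Warning: bin %d has zero error (content = %g)\n",
             i, hist->GetBinContent(i));
}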
This leads to my main question:
Was the internal behavior of TH1::Fit() with the "L" or "WL" options changed in ROOT 6.30.x to explicitly disallow bin errors of zero, resulting in segmentation faults?
I inspected the FitUtil.cxx source, especially near L1434, and found that the likelihood calculation now depends heavily on the bin errors via 1/σ². It looks like this could cause a division by zero or other undefined behavior, whereas older versions may have silently ignored or sanitized zero-error bins.
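For what it’s worth, my (possibly imperfect) understanding is that the weighted likelihood follows the Bohm–Zech prescription, where each bin’s Poisson term is scaled by a factor built from the bin content n_i and error σ_i:

$$-2\ln L_w \;=\; 2\sum_i s_i\left[f_i(\theta) - n_i + n_i\ln\frac{n_i}{f_i(\theta)}\right], \qquad s_i = \frac{n_i}{\sigma_i^2},$$

so a bin with σ_i = 0 makes s_i undefined, which would be consistent with the division-by-zero behavior described above.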
Thank you for your work on ROOT; it’s an essential tool for us! Let me know if you’d like a complete reproducer or more context; in the meantime, a rough sketch of what one might look like follows.
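This is only a sketch (untested as written against 6.30.x; the weighted filling and the zero-error bin are forced by hand, and all names and numbers are illustrative):

#include "TH1D.h"
#include "TF1.h"
#include "TRandom3.h"

void reproducer()
{
   TH1D *h = new TH1D("h", "weighted test histogram", 100, 0., 300.);
   h->Sumw2();
   TRandom3 rng(12345);
   for (int i = 0; i < 100000; ++i)
      h->Fill(rng.Exp(60.), 0.5 + rng.Rndm());   // non-unit weights
   // force a zero-content, zero-error bin in the sparse tail
   h->SetBinContent(95, 0.);
   h->SetBinError(95, 0.);
   TF1 *f = new TF1("f", "[0]*exp(-x/[1])", 0., 300.);
   f->SetParameters(5e3, 60.);                   // illustrative starting values
   h->Fit(f, "R0WLS");   // expected to crash on 6.30.x if my diagnosis is right; "R0LS" runs fine
}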
Best regards,
Byungchul Yu