Subtracting histograms: entries after Sumw2

Hello,
I see that when I add two histograms with TH1::Add() after calling TH1::Sumw2(), the number of entries is set to the sum of the two entry counts, whereas when I subtract them the resulting number of entries is far smaller. I have already read the same question in this post:

however I still don’t understand why the two cases are treated differently.

I have read the code of TH1::Add() and wonder why the statistics are reset when the coefficient of the second histogram is negative (why is there a risk of getting negative variances?), and why the resulting histogram is considered weighted, so that effective entries are used instead of “normal” entries in the stats box, which also affects the mean error and the standard-deviation error. I see that a double called nEntries is calculated, but SetEntries(nEntries) is called only when the histograms are added, not when they are subtracted:

// - Add statistics
Double_t nEntries = TMath::Abs( c1*h1->GetEntries() + c2*h2->GetEntries() );
// [...]
// statistics can be preserved only in case of positive coefficients
// otherwise with negative c1 (histogram subtraction) one risks to get negative variances
// also in case of scaling with the width we cannot preserve the statistics
// [...]
Bool_t resetStats = (c1*c2 < 0) || normWidth;
// [...]
if (resetStats) {
   // statistics need to be reset in case coefficients are negative
   ResetStats();
}
else {
   // [...]
   SetEntries(nEntries);
}

My histograms are unweighted, but I called Sumw2() so that bin errors are added in quadrature and the reduced chi-square after fitting the resulting histogram is close to 1.
I can work around the entry count with hdiff->SetEntries(h1->GetEntries() - h2->GetEntries());, but the mean error and RMS error are still computed from the effective entries rather than the entries, and I see no SetEffectiveEntries() method.
Thank you in advance for your help.


ROOT Version: 6.18/04
Platform: Ubuntu 19.04 x86_64
Compiler: g++9


Hello @kimd99,

I would suggest that @couet looks into that next week.


I think this is more a question for @moneta


Hello everyone,
is there any news on this matter?


@moneta: Do you have an idea about it?



Hi,

First of all, I think it is a bug that the number of entries differs between addition and subtraction of unweighted histograms. The reason is that in the first case (addition) the underflow/overflow are included, while in the second case (subtraction) they are not.
I have created a new JIRA item for this bug: https://sft.its.cern.ch/jira/browse/ROOT-10567

Apart from that, I think it is correct that the histogram is treated as weighted after the subtraction, if you want the bin errors to be computed correctly. The error in a bin should not be sqrt(n1-n2) but sqrt(n1+n2) when n1 and n2 come from independent samples.

The other point you mention is the computation of the error on the mean for a weighted histogram. It is true that using the effective entries there might not be fully correct, but using the raw entries could be problematic as well. I will investigate this in more detail; however, if you are interested in a correct estimate of the error on the mean, I would recommend a bootstrap procedure: simulate many histograms and compute the error from the spread of the obtained values.

Best regards

Lorenzo

I have checked the estimate of the mean error using pseudo-experiments. I also found another formula, from Cochran (1977), mentioned in this paper:
https://doi.org/10.1016/1352-2310(94)00210-C

When the mean error is computed from the subtraction of two histograms, the pull distributions look reasonable with either the formula used in ROOT (Variance / Number of effective entries) or the Cochran formula mentioned above.

For reference see https://stats.stackexchange.com/questions/25895/computing-standard-error-in-weighted-mean-estimation

I also attach a macro that checks the computed error on the mean:

Lorenzo

test_MeanError.C (2.2 KB)