TGraphAsymmErrors Divide

Hi ROOTers,

in TGraphAsymmErrors, in the Divide() method, I find this check:

for (Int_t b=1; b<=nbins; ++b) {

   // shall we use number of effective entries = (sum weights)^2 / sum (weights^2)
   // "+ 0.5" ensures correct rounding
   if (bEffective) {
      t = (Int_t)(total->GetBinContent(b) * total->GetBinContent(b) / total->GetSumw2()->At(b-1) + 0.5);
      p = (Int_t)(pass->GetBinContent(b) * pass->GetBinContent(b) / pass->GetSumw2()->At(b-1) + 0.5);
      if (p > t) {
         Warning("Divide","histogram bin %d in pass has more effective entries than corresponding bin in total! (%d>%d)",b,p,t);
         continue; //we may as well go on...

This, essentially, checks that the efficiency does not exceed 1. The point is that for bin "b" the sum of squared weights is retrieved with

TH1::GetSumw2()->At(b-1)

and I think this retrieves the wrong value, since it belongs to the previous bin (in fact, "b-1")…



This bug has already been fixed in the 5.28a release
(see )

Best Regards

Ah, thanks!

Since I found this not in the source of the ROOT version I'm using but in , I thought this bug was not fixed/found (or at least that the bug fix was not released).

However, I see strange things related to this… I read about the possible problems with weighted histograms, and since I have a weighted histogram I "dumped" its contents into an unweighted one in this way:

  TH1* hist1_clone = (TH1*)hist1->Clone(); 
  TH1* hist2_clone = (TH1*)hist2->Clone(); 

  hist1_clone->Reset();//to be really sure
  hist2_clone->Reset();//to be really sure

  const int nbins=hist1_clone->GetNbinsX();

  for (int ii = 0; ii <= nbins + 1; ii++) { // include under- and overflow bins
    double cont1 = hist1->GetBinContent(ii);
    double cont2 = hist2->GetBinContent(ii);

    double err1 = hist1->GetBinError(ii);
    double err2 = hist2->GetBinError(ii);

    hist1_clone->SetBinContent(ii, cont1);
    hist1_clone->SetBinError(ii, err1);
    hist2_clone->SetBinContent(ii, cont2);
    hist2_clone->SetBinError(ii, err2);
  }

And yet, for some bins where the efficiency is close to 1, I still get the same warning from BayesDivide…