Hello,
I noticed a funny issue when defining histograms with a large bin offset.
For example
TH1D * h_new = new TH1D("h_new", "h_new", 100, 100000, 100100);
h_new->Print("all");
fSumw[0]=0, x=99999.5
fSumw[1]=0, x=100000
fSumw[2]=0, x=100002
fSumw[3]=0, x=100002
fSumw[4]=0, x=100004
fSumw[5]=0, x=100004
As you can see, the number of bins is 100 and the histogram range also has a width of 100, so each bin should be exactly 1 unit wide. I would expect the printed bin centers to run sequentially through 100000.5, 100001.5, 100002.5, etc., but instead consecutive bins are printed with the same x value.
When I reduce the offset, or start at 0, the problem disappears and I get the expected behavior.
For example
TH1D * h_new = new TH1D("h_new", "h_new", 100, 100, 200);
h_new->Print("all");
fSumw[0]=0, x=99.5
fSumw[1]=0, x=100.5
fSumw[2]=0, x=101.5
fSumw[3]=0, x=102.5
fSumw[4]=0, x=103.5
fSumw[5]=0, x=104.5
fSumw[6]=0, x=105.5
fSumw[7]=0, x=106.5
fSumw[8]=0, x=107.5
fSumw[9]=0, x=108.5
This seems like an issue with limited precision in the handling of xlow and xup. Since it can produce unexpected behavior, perhaps the precision of xlow and xup could be increased, or a warning printed so the user is aware of the issue?
I should also note that I am using
ROOT 5.34/00 (branches/v5-34-00-patches@44555, Jun 05 2012, 16:18:52 on macosx64)
and have not tested other versions of ROOT.
Thank you in advance!