I am working with the TLimit class and, after some testing, I have run into some odd behavior.
Attached is the source code I am using.
Basically it just:
- creates 2 histograms, “signal” and “background”, each filled from a random Gaussian
- creates a “data” histogram = signal + background
- applies the TLimit class to those histograms to obtain a TConfidenceLevel
- displays values such as CLsb, CLb, ExpectedCLs_b, etc.
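For reference, the workflow above boils down to something like the following minimal sketch (assuming the standard TLimit / TLimitDataSource / TConfidenceLevel API; the binning, statistics, and Gaussian parameters here are placeholders, and the actual attached prog.cpp differs in its details):

```cpp
#include "TH1D.h"
#include "TRandom3.h"
#include "TLimit.h"
#include "TLimitDataSource.h"
#include "TConfidenceLevel.h"
#include <iostream>

void limit_sketch() {
   TRandom3 rnd(0);
   // Two histograms filled from a random Gaussian (placeholder binning/stats)
   TH1D sig("sig", "signal", 30, -3, 3);
   TH1D bkg("bkg", "background", 30, -3, 3);
   for (int i = 0; i < 1000; ++i) {
      sig.Fill(rnd.Gaus(0, 1));
      bkg.Fill(rnd.Gaus(0, 1));
   }
   // data = signal + background
   TH1D data(sig);
   data.Add(&bkg);

   // Run TLimit on the three histograms
   TLimitDataSource source(&sig, &bkg, &data);
   TConfidenceLevel *cl = TLimit::ComputeLimit(&source, 50000);
   std::cout << "CLsb            = " << cl->CLsb() << "\n"
             << "CLb             = " << cl->CLb() << "\n"
             << "ExpectedCLs_b   = " << cl->GetExpectedCLs_b() << "\n"
             << "ExpectedCLb_sb  = " << cl->GetExpectedCLb_sb() << std::endl;
   delete cl;
}
```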
a) I often obtain values of CLsb greater than 1 (for example, 16.04). However, CLsb is supposed to be a probability, so it should lie between 0 and 1.
b) I always obtain a value for ExpectedCLb_sb of the order of 10^3 to 10^6. As with CLsb, it is supposed to be a probability and therefore between 0 and 1.
Since this is my very first bug report, I would like to make sure this issue can be confirmed by other people before submitting a real bug report.
I’m running on Ubuntu 11.10 64-bit with ROOT 5.32/01.
Thanks in advance,
prog.cpp (2.95 KB)