I agree this could be a useful function, although we already have TH1::Scale. In that case it would also be useful to have the option to scale by 1.0 / (Integral * binWidth), for when the histogram is interpreted as a probability density function.
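For reference, a minimal sketch of both normalisations with the existing TH1::Scale (assuming `h` is an already filled, non-empty `TH1*`; the "width" option of Scale additionally divides each bin by its own width):

```cpp
// Normalise to unit area using the existing TH1::Scale.
double integral = h->Integral();
if (integral != 0) {
   h->Scale(1.0 / integral);              // frequency probability per bin
   // or, for a probability density instead, divide each bin by its
   // width as well via the "width" option:
   // h->Scale(1.0 / integral, "width");
}
```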
If you want, you can create a Pull Request for this on GitHub.
I’ve made a pull request (hopefully I’ve done it correctly).
I just scaled by 1/Integral rather than 1/(Integral * binWidth), because that’s what I’ve used more often in practice.
Maybe it could be modified to take an Option_t argument and do one or the other. I just wasn’t sure how to implement the 1/(Integral * binWidth) case when different bins have different widths.
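For what it’s worth, here is a hypothetical sketch of the Option_t version (the name `Normalize` and the signature are my assumption, not what is in the PR). The "width" option of TH1::Scale already divides each bin by its own width, so variable bin widths need no special handling:

```cpp
#include "TH1.h"
#include "TString.h"

// Hypothetical free function, not the actual PR code.
void Normalize(TH1 &h, Option_t *option = "")
{
   const double integral = h.Integral();
   if (integral == 0) return;            // nothing to normalise

   TString opt(option);
   opt.ToLower();
   if (opt.Contains("width"))
      h.Scale(1.0 / integral, "width");  // probability density function
   else
      h.Scale(1.0 / integral);           // frequency probability per bin
}
```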
Hi,
Many of the different ways mentioned in those posts are not correct, e.g. using the number of histogram entries, or calling only SetBinContent and not SetBinError, etc.
I think for this reason it is worth having a function which calls TH1::Scale in the correct way.
There are only 2 cases (see the sketch after this list):

- normalise by the total counts (integral) to show the frequency probability in each bin
- normalise by the total counts * bin width to show the estimated probability density function
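In code, each case maps onto one TH1::Scale call and satisfies a different invariant (a sketch, assuming `h` is filled and non-empty):

```cpp
// Case 1: frequency probability per bin; afterwards h1->Integral() is 1.
TH1 *h1 = (TH1 *)h->Clone("h1");
h1->Scale(1.0 / h1->Integral());

// Case 2: probability density; afterwards h2->Integral("width") is 1.
TH1 *h2 = (TH1 *)h->Clone("h2");
h2->Scale(1.0 / h2->Integral(), "width");
```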
If we want to be more precise, there is also the question of underflow/overflow.
When normalising by the total counts we could in principle include the underflow/overflow, while we cannot do so in the second case because we don’t know the underflow/overflow bin width.
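Concretely, including the under/overflow in the first case just means extending the integration range (bin 0 is the underflow bin and bin GetNbinsX()+1 the overflow bin):

```cpp
// Sketch: normalise by the total counts including under/overflow.
double total = h->Integral(0, h->GetNbinsX() + 1);
if (total != 0) h->Scale(1.0 / total);
```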