Converting Standard Deviation from Log Scale to Linear Scale

_ROOT Version:_ 6.32.06
_Platform:_ Linux
_Compiler:_ gcc


Hi all,
I’ve been working on the energy distribution of muons, which I plotted using logarithms (base 10), and I have a couple of questions about how to interpret the results and convert them back to the linear scale.

I understand that the mean in the log scale can be converted back to the linear scale by taking the antilog (i.e., 10^(mean value)). However, I’m a bit uncertain about how to handle the standard deviation in this case.

From what I understand, we cannot simply convert the standard deviation (SD) from the log scale to the linear scale by taking the antilog, the way we do with the mean.

Could you please clarify the correct way to interpret or convert the standard deviation from the log scale to the linear scale (GeV)?

Thanks.

Hi Raveena,

The mean of the logs is different from the log of the mean.
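A quick illustration (numbers invented just for the example): for two energies of 10 GeV and 1000 GeV the arithmetic mean is 505 GeV, but the mean of log10(E) is 2 and 10^2 = 100 GeV, which is the geometric mean, not the arithmetic one. If your log10(E) histogram happens to be roughly Gaussian with mean $\mu$ and standard deviation $\sigma$, the linear-scale (log-normal) estimators would be

$$\langle E\rangle = 10^{\mu}\, e^{(\sigma\ln 10)^2/2}, \qquad \sigma_E = \langle E\rangle\,\sqrt{e^{(\sigma\ln 10)^2}-1},$$

so neither the mean nor the SD comes out of a plain antilog.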
If you are interested in the mean energy, why not fill a control histogram with the energy values and get the estimators you need directly from it? You can always draw the X axis on a logarithmic scale if needed (you already did that nicely for the Y axis in the plot you shared).
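Something along these lines, as a minimal sketch: the macro name, the binning/range and the toy log-normal generator are just placeholders, replace the fill loop with your own energies in GeV.

```cpp
#include "TH1D.h"
#include "TCanvas.h"
#include "TMath.h"
#include "TRandom3.h"
#include <cstdio>

void energy_control_hist()
{
   // Log-spaced bin edges so the histogram still looks reasonable on a log X axis
   const int nBins = 50;
   double edges[nBins + 1];
   for (int i = 0; i <= nBins; ++i)
      edges[i] = TMath::Power(10., -1. + 4. * i / nBins);   // 0.1 GeV .. 1 TeV (assumed range)

   TH1D hE("hE", "Muon energy;E [GeV];Entries", nBins, edges);

   // Placeholder input: replace this loop with your own energy values (GeV)
   TRandom3 rng(0);
   for (int i = 0; i < 100000; ++i)
      hE.Fill(TMath::Power(10., rng.Gaus(1.5, 0.5)));       // toy log-normal spectrum

   // Estimators come out directly in GeV, no conversion needed
   printf("linear-scale mean   = %g GeV\n", hE.GetMean());
   printf("linear-scale stddev = %g GeV\n", hE.GetStdDev());

   TCanvas c("c", "c");
   c.SetLogx();          // only the axis is logarithmic, the statistics stay in GeV
   hE.DrawCopy();
   c.SaveAs("energy_control.png");
}
```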

Cheers,
D