Set absolute scale for colz?

Dear ROOTers,

This question is about a possible addition to the Draw option ‘colz’ (or a feature that may already exist and that I have missed; in that case I would be grateful if someone could point me to it).

As far as I know, the Draw method with the ‘colz’ option, when applied to a TH2F histogram, takes the minimum and maximum of the histogram, divides the range in between into equal steps, and assigns a color from a palette to each step. So if palette(1) is used, e.g., the bin with the maximum content would be drawn red, and the bin with the minimum content blue.

I can think of applications where one wants, e.g., to compare theory predictions to experimental data for different sets of theory parameters. If the difference (theory − experiment) is drawn on some TH2, it would be very convenient to have the colors of ‘colz’ affixed to an ‘absolute’ value scale of this difference. E.g., if the theory for one set of parameters always overshoots the experiment, it would be handy to have the corresponding TH2 drawn only in the ‘upper’ colors of palette(1), i.e. only red gradients; whereas if the theory for a different set of parameters always underestimates the experiment, the corresponding TH2 would be drawn only in the ‘lower’ colors of palette(1), i.e. only blue gradients. With a plain Draw(‘colz’), I think all of the above histograms would feature blue AND red gradients (from the min to the max of each single histogram), and it would be hard to compare them quickly, because the number scale of the colz axis, and hence the color coding, is different for each histogram.

So what I am looking for is a function like ::SetColzRange(valuelow,valuehigh), with which one could ‘hard-wire’ the scale of the color assignment of ‘colz’ to the interval from ‘valuelow’ to ‘valuehigh’. It could be implemented so that if a TH2 is drawn with ‘colz’ after this function has been called, the interval [valuelow,valuehigh] is split into equal steps and a color of the palette is assigned to each step. If a bin has a content lower than ‘valuelow’, it is simply given the color of the ‘lowest’ segment, in the palette(1) case plain blue. At the opposite end, a bin with a content higher than ‘valuehigh’ is assigned the ‘highest’ segment color, for palette(1) plain red.

If one knows roughly the scale to be expected when comparing two things, e.g. by a ratio, this additional function would allow for quick and efficient 2D comparison. For instance, ::SetColzRange(0.9,1.1) for an expected difference of at most 10% in, say, the ratio of the abundances of particles with different charges: everything smaller than one would be drawn in blue shades, even if nothing in the histogram happened to be > 1.0.

thanks for reading & cheers,

If the histograms you want to compare have the same maximum and minimum, they will be drawn using the same color scale and you can compare them. To make sure they have the same maximum and minimum, simply set them using SetMaximum() and SetMinimum() on the histograms.

thanks for the quick response.

Right, with ::SetMinimum(doublemin) and ::SetMaximum(doublemax) one can achieve a similar effect. The only problem I can think of is that in an automated process (many different histograms), some histograms might have bin contents below doublemin or above doublemax, and those bins, I guess, would then be shown empty.

To prevent this, one could set doublemin and doublemax to values ‘just far enough apart’, but then two things might happen. First, one may lose some color resolution in the important part of the number range (e.g. when resolving the region close to a sign flip in a difference). Second, one might not be able to guess doublemin and doublemax so that they span all bin contents; to achieve this, one could loop over all histograms, determine the actual minimum and maximum bin contents of all histograms involved, and set doublemin and doublemax accordingly. But that sounds like a bit of additional coding for the user…

The method I brought up in my first post, something like ::SetColzRange(valuelow,valuehigh), would not have these two problems: it keeps a good color resolution in the important interval one wants to look at (which would then be [valuelow,valuehigh]), and it assigns all bins with contents outside this interval the respective minimum or maximum color, so no bin content information is lost. I imagine implementing such a method would require only a little source remodelling, maybe just an additional if condition checking whether the method has been called. In that case, the minimum and maximum would not be taken from the histogram contents but from the input valuelow and valuehigh. The second alteration would be to not skip bin contents outside the interval (valuelow,valuehigh) but to assign them the min/max color.

I didn’t look into the source in detail, so I can’t tell which difficulties could come up when implementing such a function. This post is just an idea for a handy additional method; maybe the implementation effort outweighs the benefit (: …

cheers & thanks for consideration,

Hi Martin,

I suggest the following ideas:
-see the THStack class and the examples in the tutorials: a THStack computes the min/max of all histograms in the stack
-you can use a log scale for the z axis
-you can specify your own contour levels (see TH2::SetContour)