Function sensitivity and color scale

Hi,

my problem is the following: I want to plot the same TF1 several times, changing the parameter values each time, in order to study the sensitivity of the function to the different parameters.

I defined a palette and I can use it to draw the function n times with different colors. Now I would like to know whether it is possible to draw a color scale together with the function, or even in a separate pad. This would help in understanding the plot.

Does anyone have any suggestions?

Thanks a lot

How did you define the palette?

Hi,

this is how I defined the palette; basically I followed the example in the tutorial:

[code]
static Int_t  colors[20];
static Bool_t initialized = kFALSE;

Double_t Red[3]    = { 1.00, 0.00, 0.00 };
Double_t Green[3]  = { 0.00, 1.00, 0.00 };
Double_t Blue[3]   = { 0.00, 1.00, 1.00 };
Double_t Length[3] = { 0.00, 0.00, 1.00 };

if (!initialized) {
   Int_t FI = TColor::CreateGradientColorTable(3, Length, Red, Green, Blue, 20);
   for (int i = 0; i < 20; i++) colors[i] = FI + i;
   initialized = kTRUE;
}

gStyle->SetPalette(20, colors);
[/code]

In that case you can use:
root.cern.ch/root/html/TPaletteAxis.html

Ok, thanks, this seems to be exactly what I need.
Is defining a TH2F the only way to get a TPaletteAxis?
Actually I simply have a TF1, but GetListOfFunctions() is not a member of that class.

This class is used in THistPainter to draw the palette. It uses a TH1 to define the min and max of the axis (see the class documentation).
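A minimal sketch of that approach, assuming a dummy TH2F (called "frame" here, a hypothetical name) is drawn with the "colz" option so that THistPainter creates the TPaletteAxis, and the TF1 curves are then overlaid; the z-range of the dummy histogram defines the palette extremes:

```cpp
// ROOT macro sketch (assumed names and parameter values, not the exact
// original macro): use an empty TH2F drawn with "colz" to get a palette
// axis next to a family of TF1 curves.
void palette_sketch()
{
   TCanvas *c = new TCanvas("c", "Sensitivity", 700, 500);
   c->SetRightMargin(0.15);                 // leave room for the palette axis

   // Dummy histogram: x range matches the TF1, z range matches the
   // range of the scanned parameter (0.05 .. 0.95 here, hypothetical).
   TH2F *frame = new TH2F("frame", "Decay;x;f(x)", 10, 0., 400., 10, 0., 1.);
   frame->SetMinimum(0.05);
   frame->SetMaximum(0.95);
   frame->Draw("colz");                     // empty, but should create the palette

   TF1 *f = new TF1("f", "[0]*exp(-[1]*x)", 0., 400.);
   f->SetParameter(1, 0.01);                // hypothetical decay constant
   for (int i = 0; i < 19; i++) {
      f->SetParameter(0, 0.95 - 0.05*i);    // scan the parameter
      f->SetLineColor(gStyle->GetColorPalette(i));
      f->DrawClone("l same");
   }
   gPad->Update();                          // force painting of the palette
}
```

This requires the ROOT framework; run it as a macro, e.g. `root -l palette_sketch.C`.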

Here is a simplified version of the script, to make the point clearer:

[code]
static Int_t  colors[20];
static Bool_t initialized = kFALSE;

Double_t Red[3]    = { 1.00, 0.00, 0.00 };
Double_t Green[3]  = { 0.00, 1.00, 0.00 };
Double_t Blue[3]   = { 0.00, 1.00, 1.00 };
Double_t Length[3] = { 0.00, 0.00, 1.00 };

if (!initialized) {
   Int_t FI = TColor::CreateGradientColorTable(3, Length, Red, Green, Blue, 20);
   for (int i = 0; i < 20; i++) colors[i] = FI + i;
   initialized = kTRUE;
}

gStyle->SetPalette(20, colors);

TF1 *decay_i = new TF1("decay_i", "[0]*exp(-[1]*x)", 0., 400.);
decay_i->SetParameters(fr_i, ki);   // fr_i and ki are defined elsewhere
decay_i->SetLineStyle(2);

TCanvas *c1 = new TCanvas("c1", "Damage Fractions", 200, 10, 700, 500);

fr_i = 0.95;
fr_c = 0.05;

for (int ii = 0; ii < 19; ii++) {
   decay_i->SetParameter(0, fr_i);
   decay_i->SetLineColor(colors[ii]);   // equivalent to colors[0]+ii: the colors are contiguous

   if (ii == 0) decay_i->DrawClone("l");
   else         decay_i->DrawClone("l same");

   fr_i -= 0.05;
   fr_c += 0.05;
}
[/code]

Ok, I'll have a look at this method.

Well, I'm trying to adapt the information and the example to my macro, but I still have some trouble. For example, for reasons I haven't understood so far, ROOT crashes when I use the command:

palette -> SetYNDC();

and of course, if I skip setting the coordinates, then I don't get the palette on the axis.

Could you explain why?

Thanks!

You should force the palette drawing before accessing "palette": call gPad->Update() first.
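For example (a sketch, assuming the palette belongs to a histogram called "frame" drawn with "colz" — a hypothetical name, not necessarily the original macro's):

```cpp
// The TPaletteAxis is only created when the pad is actually painted, so
// retrieve it after gPad->Update(); otherwise FindObject("palette")
// returns a null pointer and any call on it crashes.
frame->Draw("colz");
gPad->Update();   // forces the painting, creating the "palette" primitive

TPaletteAxis *palette =
   (TPaletteAxis*)frame->GetListOfFunctions()->FindObject("palette");
if (palette) {
   palette->SetY1NDC(0.2);   // now it is safe to move/resize it
   palette->SetY2NDC(0.9);
   gPad->Modified();
   gPad->Update();
}
```

The null-pointer check is cheap insurance: if the histogram was not drawn with a palette option, FindObject simply returns nullptr.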

Ok, it seems to work now. How can I set up the palette axis so that its lowest and highest values are the extreme values of my parameters?

The extremes of the palette axis are the min and max of the associated histogram.
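So, as a sketch (again assuming a dummy histogram "frame", and using 0.05 and 0.95 as hypothetical extreme parameter values), the palette range can be mapped onto the parameter range like this:

```cpp
// Map the palette extremes to the scanned parameter range by setting the
// minimum and maximum of the associated (dummy) histogram.
frame->SetMinimum(0.05);   // lowest parameter value (hypothetical)
frame->SetMaximum(0.95);   // highest parameter value (hypothetical)
gPad->Modified();
gPad->Update();            // repaint so the palette axis picks up the range
```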