# Formula plotting using TF1 - Issue with parameters

I have a function that I want to plot. It is actually an integral between z_min and z_max, plotted as a function of lambda, but I have evaluated the integral analytically and written the result out explicitly (see code).

```cpp
double g1_z_lambda(double *x, double *par) {
    double lambda = par[0]; // variable
    double z_min  = x[0];   // parameter
    double z_max  = x[1];

    double G = (lambda + 1)/(lambda - 1);

    // 0.25 rather than (1/4), which is integer division and evaluates to 0;
    // fabs rather than abs, which truncates a double argument to int.
    double J_1_max = 0.25*log(fabs((G + z_max)/(G - z_max)));
    double J_2_max = z_max/(2*(G*G - z_max*z_max));
    double I_0_max = (1/(G*G*G))*J_1_max + (1/(G*G))*J_2_max;
    double I_1_max = -(1/G)*J_1_max + J_2_max;
    double I_2_max = -3*G*J_1_max + G*G*J_2_max + z_max;

    double J_1_min = 0.25*log(fabs((G + z_min)/(G - z_min)));
    double J_2_min = z_min/(2*(G*G - z_min*z_min));
    double I_0_min = (1/(G*G*G))*J_1_min + (1/(G*G))*J_2_min;
    double I_1_min = -(1/G)*J_1_min + J_2_min;
    double I_2_min = -3*G*J_1_min + G*G*J_2_min + z_min;

    // Note: I_0_max and I_0_min are computed but currently unused.
    double g1 = (1 + 2*lambda + 9*lambda*lambda + (6*lambda*lambda - 2)*I_1_max + (lambda - 1)*(lambda - 1)*I_2_max)
              - (1 + 2*lambda + 9*lambda*lambda + (6*lambda*lambda - 2)*I_1_min + (lambda - 1)*(lambda - 1)*I_2_min);

    cout << "g1_lambda: " << lambda << " " << z_min << " " << z_max << " " << g1 << endl;
    return g1;
}
```

```cpp
void Integrations2() {

    double z_min[] = {0.0, 0.2, 0.4, 0.6, 0.8};
    double z_max[] = {0.2, 0.4, 0.6, 0.8, 1.0};

    // Length of the array, needed as the loop bound
    const int NumMinZValues = sizeof(z_min)/sizeof(z_min[0]);

    TCanvas *c1 = new TCanvas("c1", "c1", 700, 500);

    // F1: x-axis variable runs from 1.001 to 1000.1, with 2 parameters
    TF1 *g1 = new TF1("xg1_lambda", g1_z_lambda, 1.001, 1000.1, 2);
    g1->SetParName(0, "z_min");
    g1->SetParName(1, "z_max");
    g1->SetTitle("F1 Integrated for various z ranges; Lambda; F1");

    for (int i = 0; i < NumMinZValues; i++) {
        g1->SetParameter(0, z_min[i]);
        g1->SetParameter(1, z_max[i]);
        g1->SetLineColor(2 + i);
        g1->DrawCopy(i > 0 ? "SAME C" : "C");
    }

    // Legend: TLatex does not interpret "\n", so each range gets its own DrawLatex call
    TLatex SB_LS;
    SB_LS.SetNDC();
    SB_LS.SetTextSize(0.04);
    SB_LS.DrawLatex(0.5, 0.835, "#color[2]{0.0-0.2}");
    SB_LS.DrawLatex(0.5, 0.790, "#color[3]{0.2-0.4}");
    SB_LS.DrawLatex(0.5, 0.745, "#color[4]{0.4-0.6}");
    SB_LS.DrawLatex(0.5, 0.700, "#color[5]{0.6-0.8}");
    SB_LS.DrawLatex(0.5, 0.655, "#color[6]{0.8-1.0}");
}
```

However, it seems to be doing something funny with z_min and z_max when I look at the cout output. The integral should be evaluated between z_min and z_max and then plotted as a function of lambda.

The problem is that the roles of `x` and `par` are swapped: for a TF1, the `x` array holds the axis variable and `par` holds the parameters. The assignments at the top of the function should therefore read:

```cpp
double lambda = x[0];   // variable (the TF1 x axis)
double z_min  = par[0]; // parameter 0
double z_max  = par[1]; // parameter 1
```
