Fitting a double negative binomial

Dear Rooters
I am trying to fit a double NBD (negative binomial distribution) to a set of data. The fit has six parameters, including one for normalisation. The output is very sensitive to the starting values of the parameters, which makes the fit extremely hard to converge.

Any suggestions on how to do this and reach the minimum FCN for the fit?
Thanks

My code is given below:

``````cpp
#include "TCanvas.h"
#include "TROOT.h"
#include "TStyle.h"
#include "TMath.h"
#include "TGraphErrors.h"
#include "TF1.h"
#include "TLegend.h"
#include "TArrow.h"
#include "TLatex.h"

#include <fstream>
#include <iostream>

using namespace std;
void DNBD()
{

gStyle->SetOptFit(1);

// Input data file basenames
const char *file1[2] = {"1-pp", "2-pp"};

// x range of the fit for each data set (file2 = lower edge, file3 = upper edge)
const double file2[2] = { 1,  1};
const double file3[2] = {20, 60};
// Starting values of the normalisation constant, parameter [0]
const double file4[2] = { 0.2, 1.0};

// Starting values of the two parameters (k, m) of the first NBD
const double file5[2] = { 0.2, 0.2};
const double file6[2] = { 0.2, 1.5};

// alpha, the fractional weight of the first NBD, to be fitted; starting values in file7
const double file7[2] = { 0.2, 0.582};

// The second NBD carries weight (1 - alpha); starting values of its parameters in file8 and file9
const double file8[2] = { 0.2, 0.2};
const double file9[2] = { 0.2, 0.95};
// Output plot file basenames
const char *file10[2] = {"pp2_0to2_5", "pp2_0to4_5"};

// Files to write the fitted parameter values
const char *file11[2] = {"pp2_0to2_5.txt", "pp2_0to4_5.txt"};

char name1[200]; char name2[200]; char name3[200];

for(int i=0;i<1;i++){ // fits only the first data set; use i<2 for both

sprintf(name1, "%s.txt", file1[i]);
TGraphErrors graph(name1);

graph.SetTitle("2NBD distribution;n;Probability P(n) = 1/N (dn/dN)");
graph.SetMarkerStyle(kOpenCircle);
graph.SetMarkerColor(kBlue);
graph.SetLineColor(kBlue);
//    graph.GetYaxis()->SetMaximum(0.15);
//graph.GetYaxis()->SetRangeUser(0,0.15);

// Double NBD: [0]*( [3]*NBD(x; k1=[1], m1=[2]) + (1-[3])*NBD(x; k2=[4], m2=[5]) ), with
// NBD(n; k, m) = Gamma(n+k)/(Gamma(n+1)*Gamma(k)) * (m/k)^n / (1 + m/k)^(n+k)
TF1 f("f",
      "[0]*( [3]*((TMath::Gamma(x+[1])*TMath::Power([2]/[1],x))/"
      "(TMath::Gamma(x+1.0)*TMath::Gamma([1])*TMath::Power(1.0+[2]/[1],x+[1])))"
      " + (1.0-[3])*((TMath::Gamma(x+[4])*TMath::Power([5]/[4],x))/"
      "(TMath::Gamma(x+1.0)*TMath::Gamma([4])*TMath::Power(1.0+[5]/[4],x+[4]))) )",
      file2[i], file3[i]);

f.SetParameter(0, file4[i]); // c (normalization constant)
f.SetParameter(1, file5[i]); // b1
f.SetParameter(2, file6[i]); // eta1
f.SetParameter(3, file7[i]); // alpha fraction
f.SetParLimits(3,0.10,0.92);
f.SetParameter(4, file8[i]); // b2
f.SetParameter(5, file9[i]); // eta2
graph.Fit(&f, "ME");

// Write the six fitted parameter values, one per line
ofstream outfile(file11[i]);
for (int ip = 0; ip < 6; ++ip) outfile << f.GetParameter(ip) << endl;
outfile.close();

TCanvas* c1 = new TCanvas();
graph.DrawClone("APE");
f.DrawClone("Same");

sprintf(name3, "PDF-DNBD/%s.pdf", file10[i]);
c1->SaveAs(name3);

}
}
``````

One data file is (columns are x, y, ex, ey, as read by the `TGraphErrors` text-file constructor):

``````text
 1  0.24435  0  0.00766845
 2  0.191    0  0.00403352
 3  0.14272  0  0.00147299
 4  0.10675  0  0.000297321
 5  0.08027  0  0.000774919
 6  0.06109  0  0.00124535
 7  0.04622  0  0.00143851
 8  0.03457  0  0.00146513
 9  0.02609  0  0.00139442
10  0.0193   0  0.00135204
11  0.01408  0  0.00118229
12  0.01017  0  0.0010819
13  0.00723  0  0.00098995
14  0.00543  0  0.000830241
15  0.00355  0  0.000611882
16  0.0026   0  0.000414849
17  0.00178  0  0.000657647
18  0.00135  0  0.000294109
19  0.00082  0  0.000234094
20  0.00062  0  0.000202485
``````

Hi @kaur,

If a fit model is extremely unstable, it usually means one of two things: the model has too many parameters, so there are degeneracies in how the parameters can be chosen to achieve the same shape, or there are strong correlations between the parameters. It can also be both at the same time.
Try reducing the number of free parameters to make a simpler model. You can also constrain or fix parameters to effectively reduce the dimensionality of the fit space.
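One concrete way to act on this in ROOT (a sketch against the macro above, assuming the mixture weight alpha is parameter 3 as in the posted `TF1`; not a guaranteed recipe) is a two-stage fit: freeze the weight for a first pass, then release it and refit from the improved seed:

```cpp
// Stage 1: freeze alpha at a guess and fit the remaining 5 parameters.
f.FixParameter(3, 0.5);        // 0.5 is an illustrative starting weight
graph.Fit(&f, "ME");

// Stage 2: free alpha again and refit all 6 parameters, now starting
// from the values found in stage 1.
f.ReleaseParameter(3);
f.SetParLimits(3, 0.0, 1.0);   // keep the weight physical
graph.Fit(&f, "ME");
```

Fitting each NBD component separately over the sub-range where it dominates, and using those results as starting values for the full mixture, is another variant of the same idea.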
