BB-lite in HistFactory

Hi,

I am very new to HistFactory, but I am attempting to use it to perform a BB-lite (Barlow-Beeston-lite) type fit.

I have two templates with statistical errors that are not normalized to the data sample, and I am not sure how to proceed properly.

I have included the stat error as a constraint using “ActivateStatError” and have added a NormFactor (via “AddNormFactor”) to each template in order to normalize to the data sample.

My first question is whether this is the correct way to proceed. The two templates have very different innate normalizations relative to one another, so their respective normalizations to the data will be quite different. I am not really sure how this affects building the stat-error constraint term, and I would really appreciate some clarification on whether this is correct or completely wrong.

My second question concerns how to properly handle the parameter of interest. The PDF looks roughly like (f*P1)*((1-f)*P2)*constraint. I implement this by adding a NormFactor to each template that can vary between 0 and 1. However, these are clearly very correlated. The resulting log likelihood is essentially a straight line (problematic?), and I wonder if this could be related.

I appreciate the help. I am a first-year grad student, so I am quite naive at this point. I’ve pasted my code below:

// assumes this runs as a ROOT macro with the usual RooFit/HistFactory headers available
using namespace RooFit; // for Constrain(), GlobalObservables(), Minos(), Save()

RooStats::HistFactory::Measurement meas("test_method", "my test_method");

meas.SetExportOnly(1);
meas.SetPOI("frac1");
meas.SetPOI("frac2");
meas.SetLumi(1.0);
meas.SetLumiRelErr(0.001);
meas.AddConstantParam("Lumi");

//starting values for the fractions of sample1 and sample2
double startfrac1 = 0.5;
double startfrac2 = 0.5;

//set up the channel (data, sample1, sample2 are histograms filled elsewhere)
RooStats::HistFactory::Channel chan("channel");
chan.SetData(data);

//scale factors that normalize each template to the data integral
double scalefactor1 = data->Integral()/sample1->Integral();
double scalefactor2 = data->Integral()/sample2->Integral();

//set sample1
RooStats::HistFactory::Sample samp1("samp1");
samp1.SetHisto(sample1);
samp1.AddNormFactor("frac1", startfrac1, 0, 1);
samp1.AddNormFactor("scale1", scalefactor1, scalefactor1, scalefactor1, true);
//activate gamma factors to constrain stat errors
samp1.ActivateStatError();
samp1.SetNormalizeByTheory(false);
chan.AddSample(samp1);

//set sample2
RooStats::HistFactory::Sample samp2("samp2");
samp2.SetHisto(sample2);
samp2.AddNormFactor("frac2", startfrac2, 0, 1);
samp2.AddNormFactor("scale2", scalefactor2, scalefactor2, scalefactor2, true);
//activate gamma factors to constrain stat errors
samp2.ActivateStatError();
samp2.SetNormalizeByTheory(false);
chan.AddSample(samp2);

meas.AddChannel(chan);
RooWorkspace *space = RooStats::HistFactory::MakeModelAndMeasurementFast(meas);

//get the exported RooWorkspace and model pdf from the file written by HistFactory
TFile *newFile = new TFile("_combined_test_method_model.root");
RooWorkspace *myspace = (RooWorkspace*)newFile->Get("combined");
RooStats::ModelConfig* mc = (RooStats::ModelConfig*)myspace->obj("ModelConfig");
RooDataSet* obsData = (RooDataSet*)myspace->data("obsData"); // the observed data set exported by HistFactory

RooArgSet params(*mc->GetNuisanceParameters(), *mc->GetParametersOfInterest());

RooFitResult* result = mc->GetPdf()->fitTo(*obsData, Constrain(params), GlobalObservables(*mc->GetGlobalObservables()), Minos(true), Save(true));

Hi @newbie_2918,

I guess I need some clarifications.

What exactly do you mean by the statistical error being “not normalized to the data sample”? What is an error that is “not normalized”?

Including the stat error via “ActivateStatError” sounds like BB-lite, and the extra “AddNormFactor” sounds like a luminosity-type scale factor. Is this what you had in mind?
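Just to spell out what I mean by BB-lite (my own sketch, not taken from your setup): per bin i the expected yield gets one shared gamma parameter for the total MC statistical uncertainty, so the contribution to the likelihood looks roughly like Pois(n_i | gamma_i * (nu1_i + nu2_i)) * Constraint(gamma_i), where nu1_i and nu2_i are the (scaled) template contents. That is what “ActivateStatError” gives you, while the normalisation to data is a separate overall scale factor on nu1 and nu2.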

Are you adding these two templates? Does the relative “size” of the two templates represent the actual probabilities of the processes? If yes, you can indeed just add them and normalise them to the data using a global scale factor, as sketched below.
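A sketch of what I mean by a single global scale factor (the name “mu” and its range are made up, adjust as needed): giving both samples the same NormFactor name makes HistFactory create one shared parameter, so instead of separate frac/scale factors you could have

// one overall normalization shared by both templates
samp1.AddNormFactor("mu", 1.0, 0.0, 10.0);
samp2.AddNormFactor("mu", 1.0, 0.0, 10.0);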

Yes, that is very problematic. If the likelihood does not look even remotely like a parabola, the fit will not work. I see now that you partly answered my previous question about whether the templates are being added. Are you sure that you want to multiply the two? Isn’t it more like a*template1 + b*template2?
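If you do want the f*template1 + (1-f)*template2 parametrisation with a single fraction as the POI, one possibility (only a sketch, not tested, assuming the usual HistFactory names, i.e. the combined pdf is called "simPdf" and your NormFactors appear as "frac1" and "frac2" in the workspace) is to keep both NormFactors in the model and then tie them together with a workspace edit:

// sketch: replace frac2 by (1 - frac1) in the exported model
myspace->factory("expr::frac2_constrained('1 - @0', frac1)");
myspace->factory("EDIT::simPdf_frac(simPdf, frac2=frac2_constrained)");
RooAbsPdf* pdf_frac = myspace->pdf("simPdf_frac");

You would then fit pdf_frac (with frac1 as the only free fraction) instead of the pdf taken from the ModelConfig, or update the ModelConfig to point to the edited pdf.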
