Hi Lorenzo,
There was a discussion some time ago about handling large numbers in a counting experiment. Your suggestion was to re-define the parameters (signal and background) so that they are close to 1. I am having a similar issue, but I am not sure how to handle this when dealing with a RooDataHist.
Say I have one histogram (TH1) for signal and another for background. The X axis of these histograms is energy and the Y axis is counts (/kg/day/keV).
I've set 'Energy' as a RooRealVar; 'Energy' is an observable.
Energy.setBins(100);
w->import(Energy);
w->defineSet("obs","Energy");
// P.O.I //
RooRealVar mu_sig("mu_sig","signal ...",-100,5000); // intentionally a huge range, as the expected signal count is large
w->import(mu_sig); // w is the RooWorkspace
w->defineSet("poi","mu_sig");
// Nuisance parameter //
RooRealVar mu_b("mu_b","background",0,50000);
w->import(mu_b);
w->defineSet("nuis","mu_b");
// Global observable //
RooRealVar a_b("a_b","Expected backgrounds",1);
w->import(a_b);
w->defineSet("gobs","a_b");
// Signal model //
// hEnergy_sig is a TH1D containing the signal histogram (bin width = 1 keV)
signal_rate = hEnergy_sig->Integral() * FiducialMass * ExposureDay;
RooRealVar signal_counts("signal_counts","signal counts expected",signal_rate);
w->import(signal_counts);
// Create RooFit model PDF //
RooDataHist DataHist_sig("DataHist_sig","signal RooDataHist",*w->set("obs"),hEnergy_sig);
RooHistPdf HistPdf_sig("HistPdf_sig","signal RooHistPdf",*w->set("obs"),DataHist_sig);
w->import(HistPdf_sig);
/**************
- if FiducialMass = 1000 kg and ExposureDay = 1000 days, then signal_counts ~ 10^6 ... a huge number
**************/
// Create background model //
// hEnergy_bg is the background histogram (bin width = 1 keV)
RooDataHist DataHist_bkg("DataHist_bkg","bkg RooDataHist",*w->set("obs"),hEnergy_bg);
RooHistPdf HistPdf_bkg("HistPdf_bkg","bkg RooHistPdf",*w->set("obs"),DataHist_bkg);
w->import(HistPdf_bkg);
bkg_rate = hEnergy_bg->Integral() * FiducialMass * ExposureDay;
w->var("mu_b")->setVal(bkg_rate);
w->var("a_b")->setVal(bkg_rate);
RooRealVar sigma_b("sigma_b","sigma",0.1*bkg_rate);
w->import(sigma_b);
w->factory("Gaussian::Constraint_b(mu_b,a_b,sigma_b)");
/**************
- if FiducialMass = 1000 kg and ExposureDay = 1000 days, then bkg_rate ~ 10^4 ... also a huge number, and sigma_b is large too, ~10^3
**************/
//>> Event probability model
w->factory("SUM::EventModel(mu_sig*HistPdf_sig, mu_b*HistPdf_bkg)");
//>> Constraint functions
w->factory("PROD::Constraints(Constraint_b)");
//>> Full probability model
w->factory("PROD::TotalEventModel(EventModel,Constraints)");
//>> Create the model configuration
// SIGNAL plus BACKGROUND model
ModelConfig* SBmodel = new ModelConfig("SBmodel", w);
SBmodel->SetPdf(*w->pdf("TotalEventModel"));
SBmodel->SetObservables(*w->set("obs"));
SBmodel->SetParametersOfInterest(*w->set("poi"));
SBmodel->SetNuisanceParameters(*w->set("nuis"));
SBmodel->SetGlobalObservables(*w->set("gobs"));
RooRealVar* firstPOI = (RooRealVar*) SBmodel->GetParametersOfInterest()->first();
firstPOI->setVal(0.1);
SBmodel->SetSnapshot(*firstPOI);
w->import(*SBmodel);
The main idea here is a sensitivity study, assuming we understand the background.
The problem is that with a fiducial mass of 1000 or 2000 kg and 1000 days of exposure (realistic values for the actual experiment), the expected numbers of signal and background events are very large. So while scanning over mu_sig (the POI), I get the following message for every tested value of mu_sig: Minuit2Minimizer::Minimize : Minimization did NOT converge, Edm is above max
As a result, the errors (1 sigma or 2 sigma) on the median value are also very large.
Now, as you pointed out before, this fit failure is caused by a numerical problem.
But I am not sure how to re-define my parameters (signal and background) to be close to 1 with a constant scaling factor.
Regards,
SP