# Integration

I have a question concerning the following problem:

I have a RooAbsPdf func, depending on some RooRealVars a, b, c, d.
What I want is a pdf that is the integral of func with respect to a, b, c, i.e. a projection onto d.
How do I proceed, especially in the case of many parameters, let's say seven, to integrate over?
And what about the normalization? Will the pdf func be taken to be normalized in the range of the RooRealVars I specified?

kind regards Markus

Hi Markus,

If you want an integral over a pdf that is itself a pdf again, you
should use the createProjection() method

RooAbsPdf* projPdf = pdf.createProjection(intObs) ;

where intObs is the observable you want to integrate over (or
a RooArgSet with multiple observables you want to integrate over).
The projected pdf has, like a regular pdf, no fixed concept of what
its normalization observables are, but will do the correct thing
when used. E.g. given F(a,b,c,d) and projF = F.createProjection(d),
use of projF with a dataset D(a,b,c,d) will define projF as

```
int F(a,b,c,d) dd / int F(a,b,c,d) da db dc dd
```

How the (multi-dimensional) integrals above are calculated depends on what F is. If it is a RooProdPdf, it is often possible to factorize the calculation into lower-dimensional integrals. If this is possible, it is done automatically. Those integrals are then calculated analytically (if supported by the respective pdfs) or numerically.

Wouter

Thank you and sorry for the late reply

I tried it and I'm still having some problems; maybe my model is too complicated. I want to create the projection and then integrate the resulting pdf over a certain range. Here is my code:

#include <iostream>

#include "RooGlobalFunc.h"
#include "RooFit.h"
#include "RooStats/RooStatsUtils.h"
#include "RooPlot.h"
#include "RooPoisson.h"
#include "RooAbsPdf.h"
#include "RooAbsReal.h"
#include "RooRealVar.h"
#include "RooDataSet.h"
#include "RooProdPdf.h"
#include "RooGaussian.h"

using namespace RooFit ;
using namespace RooStats ;

int main(){

// define input

Double_t n_main= 25.48;
Double_t n_sb1=5.8;
Double_t n_sb2=2.209302325581395349;
Double_t tau1=0.9090909090909090909;
Double_t tau2=0.5813953488372093023;
Double_t sigma_sideband1=0.053;
Double_t sigma_sideband2=0.081;
Double_t sigma_tau1=0.15;
Double_t sigma_tau2=0.08;

//define variables

RooRealVar nobs_main("nobs_main","nobs_main",n_main,0,40);//main measurement
RooRealVar nobs_sb1("nobs_sb1","nobs_sb1",n_sb1,0,10);//sideband1 measurement
RooRealVar nobs_sb2("nobs_sb2","nobs_sb2",n_sb2,0,10);//sideband2 measurement
RooRealVar tau_sb1("tau_sb1","tau_sb1",tau1);//propagation factor for sb1
RooRealVar tau_sb2("tau_sb2","tau_sb2",tau2);//propagation factor for sb2
RooRealVar sigma_tau_sb1("sigma_tau_sb1","sigma_tau_sb1",sigma_tau1);//error on tau1 in %
RooRealVar sigma_tau_sb2("sigma_tau_sb2","sigma_tau_sb2",sigma_tau2);//error on tau2 in %
RooRealVar sigma_sb1("sigma_sb1","sigma_sb1",sigma_sideband1);//error on sideband1 in %
RooRealVar sigma_sb2("sigma_sb2","sigma_sb2",sigma_sideband2);//error on sideband2 in %
RooRealVar b1("b1","b1",n_sb1/tau1,0,10);//background expectation1
RooRealVar b2("b2","b2",n_sb2/tau2,0,6);//background expectation2
RooRealVar delta_tau_sb1("delta_tau_sb1","delta_tau_sb1",0,-3*sigma_tau1,3*sigma_tau1);//nuisance parameter for tau1
RooRealVar delta_tau_sb2("delta_tau_sb2","delta_tau_sb2",0,-3*sigma_tau2,3*sigma_tau2);//nuisance parameter for tau2
RooRealVar delta_sb1("delta_sb1","delta_sb1",0,-3*sigma_sideband1,3*sigma_sideband1);//nuisance parameter for sb1
RooRealVar delta_sb2("delta_sb2","delta_sb2",0,-3*sigma_sideband2,3*sigma_sideband2);//nuisance parameter for sb2
RooRealVar mean_tau1_uncertainty("mean_tau1","mean_tau1",0);
RooRealVar mean_tau2_uncertainty("mean_tau2","mean_tau2",0);
RooRealVar mean__sb1_uncertainty("mean__sb1","mean__sb1",0);
RooRealVar mean__sb2_uncertainty("mean__sb2","mean__sb2",0);

nobs_sb1.setConstant();
nobs_sb2.setConstant();
tau_sb1.setConstant();
tau_sb2.setConstant();
sigma_tau_sb1.setConstant();
sigma_tau_sb2.setConstant();
sigma_sb1.setConstant();
sigma_sb2.setConstant();
mean_tau1_uncertainty.setConstant();
mean_tau2_uncertainty.setConstant();
mean__sb1_uncertainty.setConstant();
mean__sb2_uncertainty.setConstant();

//poisson for main measurement
//poisson for main measurement
RooFormulaVar mean_main("mean_main","b1+b2",RooArgSet(b1,b2));
RooPoisson main_measurement("main_measurement","main_measurement",nobs_main,mean_main);

//poisson for sb1
RooFormulaVar mean_sb1("mean_sb1","tau_sb1*(1+delta_tau_sb1)*b1*(1+delta_sb1)",RooArgSet(tau_sb1,delta_tau_sb1,b1,delta_sb1));

RooPoisson sb1_measurement("sb1_measurement","sb1_measurement",nobs_sb1,mean_sb1);

//poisson for sb2
RooFormulaVar mean_sb2("mean_sb2","tau_sb2*(1+delta_tau_sb2)*b2*(1+delta_sb2)",RooArgSet(tau_sb2,delta_tau_sb2,b2,delta_sb2));

RooPoisson sb2_measurement("sb2_measurement","sb2_measurement",nobs_sb2,mean_sb2);

//Gaussians for the uncertainties on tau1,2 and sb1,2
RooGaussian tau1_uncertainty("tau1_uncertainty","tau1_uncertainty",delta_tau_sb1,mean_tau1_uncertainty,sigma_tau_sb1);
RooGaussian sb1_uncertainty("sb1_uncertainty","sb1_uncertainty",delta_sb1,mean__sb1_uncertainty,sigma_sb1);
RooGaussian tau2_uncertainty("tau2_uncertainty","tau2_uncertainty",delta_tau_sb2,mean_tau2_uncertainty,sigma_tau_sb2);
RooGaussian sb2_uncertainty("sb2_uncertainty","sb2_uncertainty",delta_sb2,mean__sb2_uncertainty,sigma_sb2);

//Build likelihood as product of those
RooArgList pdfs(main_measurement,sb1_measurement,tau1_uncertainty,sb1_uncertainty,tau2_uncertainty,sb2_uncertainty,sb2_measurement);
RooProdPdf raw_likelihood("raw_likelihood","raw_likelihood",pdfs);

//create projection
RooAbsPdf *likelihood = raw_likelihood.createProjection(RooArgSet(delta_tau_sb1,delta_sb1,b1,delta_tau_sb2,delta_sb2,b2));

nobs_main.setRange("events",0,25.48);
RooAbsReal *p_value = likelihood->createIntegral(nobs_main,RooArgSet(nobs_main),"events");
std::cout<<"significance by frequentist integration is "<<PValueToSignificance(p_value->getVal())<<std::endl;

return 0;
}

The model should work all right, but when I execute it I get a warning about a possible lack of numeric precision, followed by a segmentation violation.

Maybe you can help me

Thank you very much Markus

Hi Markus,

The lack of numeric precision may be a real issue, but it may not actually affect you. What is more unfortunate is that there is a bug in the printout of that warning message that causes the SEGV. This has been fixed in 5.25/02. If you give that version a try the SEGV will go away.

At that point you should see whether you are actually affected by the numeric precision issue: the warning appears when the calculated error does not meet the 1e-7 target precision, which may be overkill for some applications. The most sensitive procedure is MINUIT minimization, which typically needs 1e-6 precision to avoid instabilities in convergence; if you only plot the integrated likelihood you may be fine with precisions down to 1e-3. You might be able to improve this by reconfiguring the integration algorithm to allow for a larger number of samplings. For 2D integration the parameter is set as follows