Convolution and sampling of multidimensional p.d.f.s

Hi,

A RooFFTConvPdf stores its full output (in all observables) internally in a RooHistPdf, since the FFT convolution is calculated all at once, in contrast to a regular pdf, where each point is evaluated one at a time.

By default, the RooHistPdf that stores this output has 2nd-order interpolation enabled to promote smoothness of the output pdf; however, RooHistPdf does not implement interpolation algorithms for more than two dimensions. This is not a (mathematically) fundamental limitation, but I have no generic N-dimensional interpolation algorithm at hand at the moment.

In the absence of that, there are two possibilities to work around this:

a) request that no interpolation is performed in the caching RooHistPdf. This can be done in the ctor:

RooFFTConvPdf(const char *name, const char *title, RooRealVar& convVar, RooAbsPdf& pdf1, RooAbsPdf& pdf2, Int_t ipOrder=2);

i.e. set the last parameter to zero.
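
As an illustration of option (a), here is a minimal sketch in a ROOT macro; the observable, pdfs, and binning below are placeholder choices, not taken from the original post:

 #include "RooRealVar.h"
 #include "RooGaussian.h"
 #include "RooFFTConvPdf.h"

 // hypothetical convolution observable and input pdfs -- substitute your own
 RooRealVar t("t", "t", -10, 10);
 t.setBins(1000, "cache");              // binning used for the FFT sampling
 RooRealVar mean("mean", "mean", 0.0);
 RooRealVar sigma("sigma", "sigma", 1.0);
 RooGaussian sig("sig", "signal", t, mean, sigma);
 RooRealVar zero("zero", "zero", 0.0);
 RooRealVar rwidth("rwidth", "resolution width", 0.5);
 RooGaussian res("res", "resolution", t, zero, rwidth);

 // last argument ipOrder = 0 disables interpolation in the caching RooHistPdf
 RooFFTConvPdf conv("conv", "sig (x) res", t, sig, res, 0);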

b) request that the result be cached in a lower-dimensional histogram. To do so, call

 void RooFFTConvPdf::setCacheObservables(const RooArgSet& obs)

 with the variables that you would like to be cached. One option is to cache only the convolution
 observable. This will, however, trigger recalculation of the cache whenever one of your
 other (y,z) observables changes value, so it is potentially very time consuming, but not necessarily:
 it depends on how many events you have in your likelihood. If that number is small (compared to
 the number of bins in the (y,z) space used for caching), this might still work out OK.
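
As a rough sketch of option (b), continuing from the placeholder names used in the sketch above (t is the convolution observable; y and z stand for the other observables of your multidimensional pdf):

 #include "RooArgSet.h"

 // build the convolution as usual, then restrict the cache to the
 // convolution observable t only; the dependence on the other
 // observables is then recomputed rather than cached
 RooFFTConvPdf conv("conv", "sig (x) res", t, sig, res);
 conv.setCacheObservables(RooArgSet(t));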

Wouter
