Usage of TEfficiency with weights (positive, negative, global...)

Dear rooters,

I am currently computing a trigger efficiency, as a function of a given variable, by filling two histograms (before and after passing the trigger) and, from them, producing a TEfficiency object using the constructor TEfficiency(const TH1& passed, const TH1& total).
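For concreteness, this is roughly what I am doing (a minimal sketch only; the names, binning and variable below are placeholders, not my actual code):

#include "TEfficiency.h"
#include "TH1D.h"

void triggerEff()
{
   // placeholder binning and variable; in the real code these come from the analysis
   TH1D hTotal("hTotal", "all events", 50, 0., 500.);
   TH1D hPass ("hPass",  "events passing the trigger", 50, 0., 500.);
   hTotal.Sumw2();
   hPass.Sumw2();

   // ... event loop: hTotal.Fill(x) for every event,
   //                 hPass.Fill(x) only when the trigger fired ...

   // CheckConsistency verifies equal binning and pass <= total in every bin
   if (TEfficiency::CheckConsistency(hPass, hTotal)) {
      auto* eff = new TEfficiency(hPass, hTotal); // clones the histograms internally
      eff->Draw("AP");
   }
}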
Now, many questions (and possibly issues) related with this procedure are coming to my mind. Let me summarize them in the following, hoping that someone can shed light on this.

  1. Negative per-event weight
    Some of the processes I am taking into account have negative per-event weights, coming from the MC generator. Of course this is problematic, as this means that, in a given bin, the “pass” histogram can have a bigger yield than the “total” histogram, making TEfficiency::CheckConsistency complain. I am thinking of dropping the negative-weighted events in my calculation but…is this safe?

  2. Global weight
    In my calculation, the events that populate the total and pass histograms come from a set of different processes. Clearly, I should “mix” them by taking into account the normalization factor σL/N_gen. If I understand the TEfficiency documentation correctly, in order to do this properly I should produce one TEfficiency object per process and then .Add() them. Is this correct? (See the sketch after this list.)

  3. Positive per-event weight
    Related to question 2), and assuming the answer to 2) is yes, does this still hold if I have to multiply the global weight by a positive per-event weight, which may vary from one event to the other?
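For question 2, this is the kind of construction I have in mind (a minimal sketch; the helper name, weights and histograms are placeholders):

#include "TEfficiency.h"
#include "TH1.h"

// Hypothetical helper for question 2: one TEfficiency per process, each
// carrying its own global weight sigma*L/N_gen, combined with Add.
TEfficiency* combineProcesses(const TH1& passA, const TH1& totalA, double weightA,
                              const TH1& passB, const TH1& totalB, double weightB)
{
   TEfficiency effA(passA, totalA);
   effA.SetWeight(weightA);                 // e.g. sigmaA*lumi/ngenA for process A

   TEfficiency effB(passB, totalB);
   effB.SetWeight(weightB);                 // e.g. sigmaB*lumi/ngenB for process B

   auto* combined = new TEfficiency(effA);  // start from a copy of process A
   combined->Add(effB);                     // equivalent to *combined += effB
   return combined;
}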

Thanks a lot to you all,
cheers,

Fabio.

Hi,

Thank you for your interesting questions…

  1. Negative weights: if in a bin the total sum of weights is negative, that is problematic. In the case of total < pass due to a negative contribution in total but not in pass, I don’t know what one should do. As a first approximation you can assume for those bins that the efficiency is 1, and maybe remove only these extra negative events from the total.
  2. What you describe is correct. You can use TEfficiency::Add in this case.
  3. TEfficiency supports per-event weights, but in that case you can estimate the uncertainty only using the normal approximation (see the sketch after this list).
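For point 3, a minimal sketch of what I mean (name, binning and the event loop are placeholders):

#include "TEfficiency.h"

// Fill a TEfficiency directly with per-event weights and use the
// normal approximation for the uncertainty.
void weightedEff()
{
   TEfficiency eff("trigEff", "trigger efficiency;x;#epsilon", 50, 0., 500.);
   eff.SetUseWeightedEvents();                     // filling with weights
   eff.SetStatisticOption(TEfficiency::kFNormal);  // normal-approximation errors

   // ... event loop: for each event with value x, weight w and trigger decision passed:
   // eff.FillWeighted(passed, w, x);
}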

Cheers

Lorenzo

Dear @moneta,

thanks for your reply.

  1. I will think about what you propose; it may work.
  2. Great, thanks
  3. Does this mean that I can safely fill my histograms doing something like h->Fill(myVariable, myEvtWeight*sigma*Lumi/Ngen), where myEvtWeight changes on a per-event basis, while sigma*Lumi/Ngen is constant throughout each process? And then, once I have produced the TEfficiency objects for each process, use TEfficiency::Add to merge them into a single object?

Thanks again,
Fabio.

Hi,

If your weights vary for each single event, you can avoid the approach of setting a global TEfficiency weight for each process and using Add to combine them; that approach is useful when you have different processes, but each one is unweighted.
In your case you can fill the histograms as you suggested, with eventWeight*processWeight, and compute the efficiency from them (see the sketch below).
The alternative is to compute an efficiency for each process using only eventWeight, and then merge the results, setting a global processWeight on each process's TEfficiency.
The result will not be the same, because both are approximations, but they should be similar.
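A minimal sketch of the first option, filling the histograms directly with eventWeight*processWeight (names, binning and the event loop are placeholders):

#include "TEfficiency.h"
#include "TH1D.h"

// Fill pass/total with the combined per-event * per-process weight and
// build a single TEfficiency at the end.
void directWeightedEff()
{
   TH1D hTotal("hTotal", "all events", 50, 0., 500.);
   TH1D hPass ("hPass",  "events passing the trigger", 50, 0., 500.);
   hTotal.Sumw2();
   hPass.Sumw2();

   // ... loop over all processes and their events ...
   // double w = myEvtWeight * sigma * lumi / ngen;   // per-event times global weight
   // hTotal.Fill(x, w);
   // if (passedTrigger) hPass.Fill(x, w);

   if (TEfficiency::CheckConsistency(hPass, hTotal)) {
      TEfficiency eff(hPass, hTotal);
      eff.SetStatisticOption(TEfficiency::kFNormal);  // weighted input: normal approximation
   }
}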

Best regards

Lorenzo

Hi @moneta,

I decided to go with this approach and it works nicely. Thanks again for your feedback,

best,
Fabio.
