Filling a TTree with a TBits object? Can it be done?


I am wondering if it is possible to fill a TTree with a TBits object. I have an analysis function that makes cuts on N-tuples. Its declaration looks like the following:

Double_t analysis(string filename, TTree* toutput, Double_t xsec, Double_t Filter, Bool_t boosting,…etc)

I currently fill the events that pass the cuts into an output tree, which I use later in my main code.

I am experimenting with using TBits to record exactly which cuts are passed or not, with the aim of using that information and TestBitNumber to select events that pass exactly a certain combination of cuts.
For example: if an event fails a cut … mybits.SetBitNumber(n,1);

To do that I need to pass it back to my main function. How can I fill a TBits object into a tree, or is there a better way to do this?

Thanks very much,

It looks like you can use TBits::Set and TBits::Get to convert to/from a regular C array, or even just an Int_t (or UInt_t, etc.). As long as you know how many bits are actually present (the number of cuts), you can save the value in an integer TBranch.

You could also use std::bitset and store that in the TTree; it also has a more obvious method for converting to/from an integer type.
That page mentions using std::vector for dynamic sizing, but IIRC the std::vector&lt;bool&gt; specialization behaves differently from other std::vectors. I remember seeing some posts about it being difficult to include in a TTree.


Thank you that was very helpful!

Last question:
I am having trouble using Set() and Get(). Here, basically, is my code; I hope you can tell me what I am doing wrong.

TBits mybits(4);
Int_t array[4] = {0,1,0,1};
Int_t getarray[4]= mybits.Get();


It looks like when using Set() and Get() with integer arrays, the bit values come from the full binary representation of the integers; the arrays are not meant to be arrays of 0 or 1 values.


root [0] TBits b(8)
root [1] Int_t v = 3
root [3] b.Set(8,&v)
root [4] b.Print()
 bit:   0 = 1
 bit:   1 = 1
root [5] v = 7
(const int)7
root [6] b.Set(8,&v)
root [7] b.Print()
 bit:   0 = 1
 bit:   1 = 1
 bit:   2 = 1
root [8] v = 15
(const int)15
root [9] b.Set(8,&v)
root [10] b.Print()
 bit:   0 = 1
 bit:   1 = 1
 bit:   2 = 1
 bit:   3 = 1

The bits that are set in “b” correspond to the value of “v” written in binary. I don’t know at what point the “nbits” parameter starts to matter, and the number of bits in an integer is platform-dependent, so this looks tricky.

I might instead recommend using the SetBitNumber() and TestBitNumber() methods in a loop. That’s probably less tricky to get right, and it allows you to use an enum to give meaningful names to your cut bits. E.g.:

root [0] enum {EnergyCut, MomentumCut, HelicityCut};
root [1] EnergyCut
(const enum )0
root [2] TBits cuts(3)
root [3] cuts.SetBitNumber(MomentumCut,kTRUE)
root [4] cuts.SetBitNumber(HelicityCut,kFALSE)
root [5] cuts.Print()
 bit:   1 = 1
root [6] cuts.TestBitNumber(EnergyCut)
(const Bool_t)0

Yes, that works, thank you!
Interesting way of setting bits…