Is it possible to fill a TTree with a TBits object? I have an analysis function that makes cuts on N-tuples; when declared it looks like the following:
I currently fill the events that pass the cuts into an output tree, to be used later in my main code.
I am experimenting with using TBits to record exactly which cuts each event passes or fails, so that I can later use TestBitNumber to select events that pass exactly a certain combination of cuts.
For example, if an event fails a cut: mybits.SetBitNumber(n, 1);
To do that I need to get this information out to my main function. How can I fill a TBits object into a tree, or is there a better way to do this?
It looks like you can use TBits::Set and TBits::Get to convert to/from a regular C array, or even just a single Int_t (or UInt_t, etc.). As long as you know how many bits are actually present (the number of cuts), you can save the value in an integer TBranch.
You could also use a std::bitset and store that in the TTree; it has more obvious methods for converting to/from an integer type: http://www.cplusplus.com/reference/bitset/bitset/
That page mentions using std::vector<bool> for dynamic sizing, but IIRC the std::vector<bool> specialization behaves differently from other std::vectors. I remember seeing some posts about it being difficult to store in a TTree.
It looks like when using Set() and Get() with integer arrays, the bit values are taken from the full binary representation of the integers; the arrays are not expected to hold 0 or 1 values.
The bits that are set in “b” correspond to the value of “v” written in binary. I don’t know at what point the “nbits” parameter starts to matter, and the number of bits in an integer is platform-dependent, so this looks tricky.
I might instead recommend calling SetBitNumber() and TestBitNumber() in a loop. That’s probably less tricky to get right, and it lets you use an enum to give meaningful names to your cut bits. E.g.: