hi,
could anyone shed some light on the following?
I am trying to understand how the TEfficiency streamer works.
dumping its StreamerInfo, I get:
StreamerInfo for "TEfficiency" version=2 title=""
BASE TNamed offset= 0 type= 67 size= 0 The basis for a named object (name, title)
BASE TAttLine offset= 0 type= 0 size= 0 Line attributes
BASE TAttFill offset= 0 type= 0 size= 0 Fill area attributes
BASE TAttMarker offset= 0 type= 0 size= 0 Marker attributes
double fBeta_alpha offset= 0 type= 8 size= 8 global parameter for prior beta distribution (default = 1)
double fBeta_beta offset= 0 type= 8 size= 8 global parameter for prior beta distribution (default = 1)
vector<pair<double,double> > fBeta_bin_params offset= 0 type=500 size= 24 parameter for prior beta distribution different bin by bin
double fConfLevel offset= 0 type= 8 size= 8 confidence level (default = 0.683, 1 sigma)
TList* fFunctions offset= 0 type= 63 size= 8 ->pointer to list of functions
TH1* fPassedHistogram offset= 0 type= 64 size= 8 histogram for events which passed certain criteria
TEfficiency::EStatOption fStatisticOption offset= 0 type= 3 size= 4 defines how the confidence intervals are determined
TH1* fTotalHistogram offset= 0 type= 64 size= 8 histogram for total number of events
double fWeight offset= 0 type= 8 size= 8 weight for all events (default = 1)
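(for reference, I believe an equivalent layout can be printed from the ROOT prompt with something like:

root [0] TClass::GetClass("TEfficiency")->GetStreamerInfo()->ls()

though the exact ls() output format may differ a bit from the dump above.)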
so far so good: I could decode the base classes and all, but my code chokes on the vector<pair<double,double> >.
the on-disk representation just before the vector<...> bits (ie: just after having read the two doubles fBeta_alpha and fBeta_beta) is:
--- hex --- (pos=148 len=128 end=2820)
00000000 40 00 00 0c 40 09 00 00 00 d7 be d2 00 00 00 00 |@...@...........|
00000010 3f e5 d8 97 a2 41 a3 f5 40 00 00 11 00 05 00 01 |?....A..@.......|
00000020 00 00 00 00 03 00 00 00 00 00 00 00 00 40 00 05 |.............@..|
00000030 65 ff ff ff ff 54 48 31 44 00 40 00 05 58 00 03 |e....TH1D.@..X..|
00000040 40 00 02 1e 00 08 40 00 00 2e 00 01 00 01 00 00 |@.....@.........|
00000050 00 00 03 00 00 00 0a 65 66 66 5f 70 61 73 73 65 |.......eff_passe|
00000060 64 16 6d 79 20 65 66 66 69 63 69 65 6e 63 79 20 |d.my efficiency |
00000070 28 70 61 73 73 65 64 29 40 00 00 08 00 02 02 5a |(passed)@......Z|
trying to read the version, I get the u32 0x4000000c; applying the 0x40000000 byte-count mask leads to 12 as the byte count, and then extracting the version u16 gives 0x4009, ie: 16393, except it should be just 9.
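in code, that decoding step looks roughly like this (a minimal sketch; the read_be_* helpers are made-up names standing in for my actual big-endian readers, and the function below is just for illustration):

#include <cstdint>
#include <cstdio>

// hypothetical big-endian readers, standing in for my actual decoding code
static uint32_t read_be_u32(const uint8_t *p) {
    return (uint32_t(p[0]) << 24) | (uint32_t(p[1]) << 16) |
           (uint32_t(p[2]) <<  8) |  uint32_t(p[3]);
}
static uint16_t read_be_u16(const uint8_t *p) {
    return uint16_t((p[0] << 8) | p[1]);
}

static const uint32_t kByteCountMask = 0x40000000; // the "a byte count follows" flag

void readVersion(const uint8_t *buf) {
    uint32_t u32 = read_be_u32(buf);            // -> 0x4000000c
    if ((u32 & kByteCountMask) != 0) {
        uint32_t bcnt = u32 & ~kByteCountMask;  // -> 12, the byte count
        uint16_t vers = read_be_u16(buf + 4);   // -> 0x4009 == 16393, expected 9
        printf("bcnt=%u vers=0x%x (%u)\n", bcnt, vers, vers);
    }
}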
why is there what seems to be a 0x4000 mask applied?
under which conditions is it present (or: what does it mean)?
am I ok to assume I can just do (with mask = 0x4000):

if ((v & mask) != 0) {
    v = v & ~mask; // strip the flag, keep the remaining bits as the version
}
applying this change breaks a unit test of mine that reads back maps (with no-split):
StreamerInfo for "Event" version=1 title=""
map<int,int> mi32 offset= 0 type=500 size= 48
map<string,int> msi32 offset= 0 type=500 size= 48
map<string,string> mss offset= 0 type=500 size= 48
map<string,vector<string> > msvs offset= 0 type=500 size= 48
map<string,vector<int> > msvi32 offset= 0 type=500 size= 48
any hint? (@pcanal I guess :P)