I have a number of ROOT files I want to hadd, and the trees in them include a weight branch.
Each file also contains a TParameter<double> holding the sum of weights (computed at an earlier stage of the analysis), and I want to correct the normalization of the weight branch by multiplying it by (that file's sum_weights) / (the sum of sum_weights over all files).
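To make the intended correction concrete, here is a minimal sketch of the arithmetic, with made-up file names and sum-of-weights values standing in for the actual TParameter<double> contents:

```python
# Hypothetical per-file sums of weights, as would be read from each
# file's TParameter<double> (values here are invented for illustration).
sum_weights = {"fileA.root": 120.0, "fileB.root": 80.0}

# Total sum of weights over all input files.
total = sum(sum_weights.values())

# Per-file scale factor: (sum_weights for that file) / (total over all files).
scale = {name: sw / total for name, sw in sum_weights.items()}

# Every event weight in fileA.root would be multiplied by 120/200 = 0.6,
# and every event weight in fileB.root by 80/200 = 0.4.
print(scale)
```

The open question is how to apply a per-file factor like this while processing the merged sample.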
Does this mean I have to do this in single-threaded mode? If I’m reading the documentation correctly, I can’t do this in multithreaded mode: the rdfentry_ numbers come in an unspecified order, so a scheme that uses entry numbers to decide which file an event came from could break down at file boundaries.