I have 20 ROOT files, and I can merge them directly with hadd; the merged output comes out to about 10 GB, which is fine.
The problem is that after I scale each ROOT file, running hadd on the scaled files no longer works, and it fails with the following error:
```
terminate called after throwing an instance of 'std::bad_alloc'
  what(): std::bad_alloc
Aborted (core dumped)
```
The error occurs while merging the THnSparse objects, but since the unscaled files merge fine, I don't think it's a memory or file-size issue. All I did was traverse a list inside each ROOT file, scale each object by a factor, and write the list to a new file. Here is the simple code:
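A minimal sketch of what such a scaling macro might look like (the original code was not included in the post, so the file names, the list key `"mylist"`, and the function name `ScaleFile` are all placeholders, not the poster's actual code):

```cpp
#include "TFile.h"
#include "TList.h"
#include "TObject.h"
#include "TH1.h"
#include "THnSparse.h"

// Hypothetical reconstruction: open a file, scale every histogram-like
// object in a stored TList, and write the list to a new file.
void ScaleFile(const char* inName, const char* outName, double factor) {
   TFile in(inName, "READ");
   TFile out(outName, "RECREATE");

   // "mylist" is a placeholder for whatever key the list is stored under.
   TList* list = static_cast<TList*>(in.Get("mylist"));
   if (!list) return;

   TIter next(list);
   while (TObject* obj = next()) {
      if (auto* hs = dynamic_cast<THnSparse*>(obj)) hs->Scale(factor);
      else if (auto* h = dynamic_cast<TH1*>(obj))   h->Scale(factor);
   }

   out.cd();
   list->Write("mylist", TObject::kSingleKey);
   out.Close();
}
```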
Your suggested changes are helpful, but they don't solve my problem. I'm not making a mistake in the data-type discrimination; my THnSparse data is simply very large, and after scaling by a factor of 10^-6 to 10^-12, hadd fails to merge the files successfully. Do you have any other ideas or suggestions?
Thanks for reporting this! I would suggest either submitting a PR to ROOT that fixes this, or iterating over the filled bins and scaling them one by one instead of relying on Scale(). Sorry about that!
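The bin-by-bin workaround could look roughly like this (a sketch, not tested against your files; it assumes the scaled errors should simply be the old errors times the same factor):

```cpp
#include "THnSparse.h"
#include <vector>

// Scale a THnSparse by hand: loop over the filled bins only and write
// the scaled content and error back, instead of calling Scale().
void ScaleSparseBinByBin(THnSparse* h, double factor) {
   const int ndim = h->GetNdimensions();
   std::vector<Int_t> coord(ndim);

   for (Long64_t i = 0; i < h->GetNbins(); ++i) {
      // GetBinContent fills `coord` with the bin's coordinates,
      // which we then use to write the scaled values back.
      const double content = h->GetBinContent(i, coord.data());
      const double error   = h->GetBinError(i);
      h->SetBinContent(coord.data(), content * factor);
      h->SetBinError(coord.data(), error * factor);
   }
}
```

Since THnSparse only stores filled bins, this loop touches exactly the bins that carry data, so it stays cheap even for very high-dimensional histograms.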