I’m experiencing very slow file closing and very slow hadd-ing of files that contain many (a few tens of thousands of) TDirectories.
The file structure looks like this:

distrubution00001/Nominal
distrubution00002/Nominal
...
distrubution50000/Nominal
hadd needs a few hours to combine such files. If I flatten the structure so there are no folders (replacing “/” with “_” in the example above), the time drops to a few seconds.
I found this comment in the ROOT sources:

// Delete objects from directory list, this in turn, recursively closes all
// sub-directories (that were allocated on the heap)
// if this dir contains subdirs, we must use the slow option for Delete!
// we must avoid "slow" as much as possible, in particular Delete("slow")
// with a large number of objects (eg >10^5) would take for ever.
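For intuition only (this is a plain-Python sketch of the asymptotics, not ROOT code, and the mechanism inside Delete("slow") is my assumption): if a "slow" delete removes objects one at a time and each removal has to touch the remaining list, the total cost grows roughly quadratically with the number of objects, which would explain why it becomes unusable somewhere around 10^5 entries.

```python
import timeit

def slow_delete(n):
    # Remove items one at a time; each removal shifts the remaining
    # elements, so the total work is O(n^2).
    objs = list(range(n))
    while objs:
        objs.remove(objs[0])

def fast_delete(n):
    # Drop everything in one pass: O(n) total.
    objs = list(range(n))
    objs.clear()

t_slow = timeit.timeit(lambda: slow_delete(20_000), number=1)
t_fast = timeit.timeit(lambda: fast_delete(20_000), number=1)
print(f"one-by-one: {t_slow:.3f}s, bulk clear: {t_fast:.6f}s")
```

With 50k directories the one-by-one case would be another ~6x worse again, which is in the right ballpark for "seconds vs. hours".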
Is this unavoidable?