To run a frequentist limit search (i.e. with toys), I run 10 toys per job and then merge the results. This gives me 1000 files of ~2.5 MB each that need to be merged. The problem is that I cannot merge them: there appears to be a memory leak in how a RooStats::HypoTestInverterResult
is read from file and then deleted. Even if I only load the object and delete it immediately, memory keeps growing.
The code snippet used:
    // Keep read objects out of the current directory so the TFile does not own them.
    TDirectory::AddDirectory(false);
    RooStats::HypoTestInverterResult *result{};
    int status{0};
    size_t count{};
    for (const std::string &f : inputs) {
        ++count;
        std::unique_ptr<TFile> file = TNLimitFinder::FS::openFile(f, &status);
        if (status != 0) {
            return status;
        }
        std::cout << "File #" << count << std::endl;
        if (result == nullptr) {
            // Keep the first result as the merge target.
            result = file->Get<RooStats::HypoTestInverterResult>("result_mu_signal");
        } else {
            auto other = std::unique_ptr<RooStats::HypoTestInverterResult>(
                file->Get<RooStats::HypoTestInverterResult>("result_mu_signal"));
            // The leak occurs even with the merge step disabled:
            // result->Add(*other);
        }
        file->Close();
    }
The files are opened with TFile::Open
; the helper function only resolves their paths on disk.
Loading 50 files of 10 toys each already uses about 1.5 GB of memory, even though I do not keep the loaded objects.
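For reference, the leak can be reproduced without any of the merging logic. The sketch below is a ROOT macro, assuming ROOT with RooStats is installed; the object key `result_mu_signal` is the one from the snippet above, while the file names are placeholders for the real toy outputs. Each result is loaded and deleted immediately, yet memory usage still grows with every file:

```cpp
#include <memory>
#include <string>
#include <vector>

#include <TDirectory.h>
#include <TFile.h>
#include <RooStats/HypoTestInverterResult.h>

// Minimal reproducer: load each HypoTestInverterResult and delete it
// immediately; no object outlives its loop iteration.
void reproduce_leak()
{
   TDirectory::AddDirectory(false);
   // Placeholder file names; substitute the real toy output files.
   std::vector<std::string> inputs{"toys_0.root", "toys_1.root"};
   for (const std::string &f : inputs) {
      std::unique_ptr<TFile> file{TFile::Open(f.c_str())};
      if (!file || file->IsZombie()) continue;
      std::unique_ptr<RooStats::HypoTestInverterResult> result{
         file->Get<RooStats::HypoTestInverterResult>("result_mu_signal")};
      // result is destroyed at the end of this iteration; memory
      // should be released here, but in practice it is not.
      file->Close();
   }
}
```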