I am experiencing a rather strange issue that has me scratching my head. It seems quite simple: I open a TFile, extract an object, read some numbers, and then close the TFile. When I do this in a loop, memory consumption increases continuously. I have traced it to this specific block of code by including/removing it from the loop.
The code looks like this:
template <typename T>
T *fetch(TFile *f, const std::string &nm, T *def = NULL, bool clone = true) {
  if (f == NULL || f->IsZombie()) {
    return def;
  }
  TObject *obj(0x0);
  if (clone) {
    gROOT->cd();
    obj = f->Get(nm.c_str())->Clone((nm + "_clone").c_str());
  } else {
    obj = f->Get(nm.c_str());
  }
  if (obj != NULL) {
    return dynamic_cast<T *>(obj);
  } else {
    return def;
  }
}
for (unsigned i = 0; i < 10000; ++i) {
  TFile *f = TFile::Open("/path/to/file.root", "read");
  TGraph *g = fetch<TGraph>(f, "graph", NULL, false);
  int n = g->GetN();
  delete f;
}
You might object to the delete f, but I tried f->Close(); as well and see the same thing: with every loop iteration, memory consumption increases.
Aren’t those TGraphs leaking? You get them from the file, but they don’t get deleted. I’m not sure that the TGraph will be deleted by closing the file.
There is the convenient TFile::GetObject, which can be used like this:

T *obj = nullptr;
f->GetObject(name, obj);

It will do the type check and conversion for you; if the types don't match, it leaves obj == nullptr.
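For instance, your fetch() could be rewritten along these lines (an untested sketch, assuming a ROOT environment; it drops the clone branch for brevity):

```cpp
template <typename T>
T *fetch(TFile *f, const std::string &nm, T *def = nullptr) {
  if (f == nullptr || f->IsZombie()) {
    return def;
  }
  T *obj = nullptr;
  f->GetObject(nm.c_str(), obj);  // leaves obj == nullptr on missing key or type mismatch
  return (obj != nullptr) ? obj : def;
}
```

This also avoids the potential crash in the original, where f->Get(nm.c_str()) can return nullptr and Clone() is then called on it.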
As for the TGraph being deleted, I was always under the impression that it would be owned by the TDirectoryFile it sits in, so that TFile::Close would clean it up. I'll try a separate delete, although if memory serves me correctly, that caused a segmentation fault during my litany of vain attempts to resolve the issue.
I always call SetDirectory(nullptr) on all objects that I want to manage myself. That way they lose their association with the file, and you can do normal C++ memory management.
I seem to remember, however, that graphs don't take part in the "file memory management" anyway, because TGraph doesn't have a SetDirectory().
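That distinction can be sketched like this (untested, assumes a ROOT environment; "hist" and "graph" are the hypothetical key names from the example file):

```cpp
// Histograms participate in directory ownership; detach to manage them yourself:
TH1 *h = nullptr;
f->GetObject("hist", h);
if (h) h->SetDirectory(nullptr);  // plain C++ ownership from here on: delete h yourself

// TGraph has no SetDirectory(); a graph returned by Get()/GetObject()
// is already yours, so delete it before (or after) closing the file:
TGraph *g = nullptr;
f->GetObject("graph", g);
// ... use g ...
delete g;   // f->Close() will NOT delete it, hence the growing memory in the loop
```

Which would explain both observations: deleting the histogram after Close() segfaults (double delete), while never deleting the graph leaks.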