Behavior of TFile::Open() in a loop


ROOT Version: 6.26/06
Platform: Scientific Linux 7.3 (Nitrogen)
Compiler: None (macro)


Hi ROOT forum,

I am seeing some behavior that I don’t understand when I open TFiles within a loop. This may be an issue with my understanding of C++, but I am not sure.

My code (broadly) looks like this:

for(int i=0; i<nfiles; ++i)
  {
    TFile* file = TFile::Open(("myfile_"+std::to_string(i)+".root").c_str());
    TTree* tree = file->Get<TTree>("ttree");
    //some operations on file/tree
  }

The memory usage of my program climbs in increments of roughly the size of each myfile_i.root, which looks to me like a memory leak.

However, when I do this:

for(int i=0; i<nfiles; ++i)
  {
    TFile* file = TFile::Open(("myfile_"+std::to_string(i)+".root").c_str());
    TTree* tree = file->Get<TTree>("ttree");
    //some operations on file/tree
    delete file;
  }

The memory leak goes away. So evidently the file object is never deleted on its own; it just hangs around in memory and is never freed (that is a memory leak, all right).

Why does this happen? I would have thought that an object instantiated inside a loop is destroyed at the end of each iteration. Is that not so?

Thanks!

For the purposes of this discussion, TFile::Open indirectly acts as a call to operator new, so the code above is similar to:

for(...) {
   someclass *obj = new someclass;
}

where the object is allocated but never deleted, i.e. the memory leaks.
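
One common way to avoid this, sketched below along the lines of your snippet (the file and tree names are taken from your example, and the wrapping function name is just for illustration), is to let a std::unique_ptr own the TFile so it is deleted at the end of each iteration:

#include <memory>
#include <string>
#include "TFile.h"
#include "TTree.h"

void loop_over_files(int nfiles)   // hypothetical driver function
{
  for(int i=0; i<nfiles; ++i)
    {
      // the unique_ptr owns the TFile and deletes it (closing the file)
      // when it goes out of scope at the end of each iteration
      std::unique_ptr<TFile> file(TFile::Open(("myfile_"+std::to_string(i)+".root").c_str()));
      if(!file || file->IsZombie()) continue;   // skip files that failed to open
      TTree* tree = file->Get<TTree>("ttree");
      //some operations on file/tree
    }
}

Keep in mind that the TTree read from the file is owned by the file, so it must not be used after the TFile has been deleted (or closed).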

Cheers,
Philippe.

‘Technically’, with TFile::Open this is not a leak but ‘hoarding’, as the file objects will be deleted … at the end of the process.
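
As an illustration of the hoarding (just a sketch, reusing the file names from the example above; the wrapping function name is hypothetical): ROOT keeps track of the open files in gROOT->GetListOfFiles(), so without the delete you can watch the list grow with every iteration:

#include <cstdio>
#include <string>
#include "TFile.h"
#include "TROOT.h"

void count_open_files(int nfiles)   // hypothetical driver function
{
  for(int i=0; i<nfiles; ++i)
    {
      TFile::Open(("myfile_"+std::to_string(i)+".root").c_str());
      // without a delete, every iteration leaves one more file in the list;
      // these objects are only cleaned up when the process exits
      printf("open files so far: %d\n", gROOT->GetListOfFiles()->GetEntries());
    }
}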