Memory leak or normal behavior with TChain?

I have a lot of files, each containing one tree. To process them, I add the files one by one to a TChain and loop over all events, like this:

TChain *trkjettree = new TChain("TrkJetTree");
// read a list of file names and add each one to the TChain
ifstream fin( filelist );
char buffer[256];
int Nfile = 0;
// test the stream state returned by getline, not fin.eof()
while ( fin.getline( buffer, sizeof(buffer) ) ) {
  if ( buffer[0] == '\0' ) continue;   // skip blank lines
  trkjettree->Add( buffer );
  cout << "You are adding the root file: " << buffer << endl;
  Nfile++;
}

Long64_t maxBufEntries = 20000; // set the maximum number of buffered entries
trkjettree->SetCircular( maxBufEntries );

// loop over all events
Long64_t Nevent_trk = trkjettree->GetEntries();
for ( Long64_t ievent = 0; ievent < Nevent_trk; ievent++ ) {
  // load the entry
  trkjettree->GetEntry( ievent );
}

The total size of all these files is about 3.0 GB. As the loop runs and more entries are loaded, the memory usage keeps increasing, until it reaches the limit (2 GB on my computer) and the code aborts. Is this a memory leak, or normal behavior? I expected maxBufEntries (via SetCircular) to keep the memory bounded, but it seems to have no effect at all.

ROOT Version: v4_04_02b_fbits_eh-GCC_3_4_3-opt
Linux: Linux 2.4

Could anybody point out my problem? Thanks.


Again, for reference: all the files together contain about 200k events. After about 60k events are loaded, the memory usage reaches the 2.0 GB limit.


This is unusual behavior. Could you please provide a way to reproduce the problem?


I am also a Fermilab guy. If you like, you can get access to my test code at
My test code is named 'test_chain.C'.

Before you run it, please run:
$ . setup.rc