Hi,
I have a lot of files, each containing one tree. In order to process them, I add all these files one by one into a TChain and loop over all events, like this:
[code]
{
   // chain the "TrkJetTree" tree from every input file.
   TChain *trkjettree = new TChain("TrkJetTree");

   // read my list of files and add each one to the chain;
   // filelist holds the path to a text file with one ROOT file name per line.
   ifstream fin( filelist );
   char buffer[256];
   int Nfile = 0;
   while ( fin.getline( buffer, sizeof(buffer) ) ) {
      if ( buffer[0] == '\0' ) continue;   // skip empty lines
      trkjettree->Add( buffer );
      cout << "You are adding the root file: " << buffer << endl;
      Nfile++;
   }

   Long64_t maxBufEntries = 20000;   // set the maximum number of buffered entries.
   trkjettree->SetCircular( maxBufEntries );

   // loop over all events.
   Long64_t Nevent_trk = trkjettree->GetEntries();
   for ( Long64_t ievent = 0; ievent < Nevent_trk; ievent++ ) {
      trkjettree->GetEntry( ievent );   // load the entry.
   }
}[/code]
The total size of all these files is about 3.0 GB. As the loop runs and more events are loaded, the memory usage keeps increasing, and once it reaches the limit (2 GB on my computer) the code aborts. Is this a memory leak, or is it normal behavior? I thought maxBufEntries passed to SetCircular would take care of this, but it seems to have no effect.
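For reference, this is how I understood SetCircular from the class description: it makes a memory-resident tree that is being filled circular, so that only the last maxEntries entries are kept. A minimal sketch of what I expected (the tree and branch names here are my own, for illustration only):
[code]
{
   // hypothetical in-memory tree, filled entry by entry.
   TTree *t = new TTree("demo", "circular-buffer demo");
   Double_t x;
   t->Branch("x", &x, "x/D");

   t->SetCircular(20000);   // keep only the most recent 20000 entries
   for (Long64_t i = 0; i < 100000; i++) {
      x = i;
      t->Fill();            // older entries should be dropped automatically
   }
   cout << "entries kept: " << t->GetEntries() << endl;   // at most 20000
}
[/code]
Since my trees are only being read back from files, not filled, maybe SetCircular does not apply to my case at all? If so, I do not see what does limit the memory while reading.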
ROOT Version: v4_04_02b_fbits_eh-GCC_3_4_3--opt
Linux: Linux 2.4
Could anybody point out my problem? Thanks.
Cheers,
Zhiyi.