Unbearably slow TChain issues

Hello,

I am trying to string lots of small files together and then run an analysis over that mass of data. I can make the program run, but it's so slow as to be functionally unusable. Here is the relevant snippet:

TChain* events = new TChain("events");
TString list_file = "list.txt";
ifstream in;
in.open(list_file);
TString run_num;

while(true) {
    //Set up the string handling
    TString run_prefix = "Run0";
    //Read each individual line from list.txt into the variable run_num
    in >> run_num;
    //If we reach the end of file or something else goes wrong, break out of the loop
    if(!in.good()) break;
    run_prefix += run_num;
    //Set up the logic for the allpulses set of files. Each ROOT file has exactly one associated with it.
    TString pulses = run_prefix;
    pulses += "_allpulses.root";
    //2Chainz
    TChain* run_chain = new TChain("events");
    TChain* pulse_chain = new TChain("pulse_info");
    run_prefix += ".root";
    TString run_file = "/file/path" + run_prefix;
    TString pulse_file = "/file/path" + pulses;
    //cout<<pulse_file<<std::endl;
    //Add all the chains together
    pulse_chain->Add(pulse_file);
    run_chain->Add(run_file);
    events->Add(run_chain);
    events->AddFriend(pulse_chain);
    //Delete all the unnecessary objects to prevent a segmentation fault
    delete run_chain, run_file, pulse_file, run_prefix;
}

Each event in the "events" chain is then put through the analysis. Does anyone know why this is unbearably slow?

Dear joshuacarey,

Can you elaborate more on what you mean by “unbearably slow”?
Did you check that the final “events” chain is what you wanted (with events->ls() or events->Print())?
How much data does the final “events” chain contain?
How does it compare with processing a single file?
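
For example, something along these lines (just a rough sketch, assuming the chain has been built as in your snippet) would show what “events” actually ended up containing:

events->ls();                           // list the files attached to the chain
events->Print();                        // print the branch structure
std::cout << "entries: " << events->GetEntries() << std::endl;
if (events->GetListOfFriends())         // show which friend trees were attached
    events->GetListOfFriends()->Print();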

Also note that in your snippet run_file and pulse_file are not on the heap, so you should not ‘delete’ them.
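
If part of the problem is the bookkeeping itself, one thing worth trying (only a rough sketch, keeping your file-naming convention and assuming each pulse_info tree is entry-aligned with its events tree, as friend trees must be) is to add the files directly to two chains and attach the friend once, instead of creating and deleting a TChain pair per file:

#include <fstream>

TChain* events = new TChain("events");
TChain* pulses = new TChain("pulse_info");
std::ifstream in("list.txt");
TString run_num;
while (in >> run_num) {
    // Same placeholder path and naming convention as in your snippet.
    TString base = "/file/path";
    events->Add(base + "Run0" + run_num + ".root");
    pulses->Add(base + "Run0" + run_num + "_allpulses.root");
}
// Attach the friend once, after both chains are complete.
events->AddFriend(pulses);

Whether that removes the slowness is hard to say without the numbers asked for above, but it at least takes the per-file chain construction out of the picture.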

G Ganis