
Problem with writing large TTrees

Hello!

I have a piece of code that works as follows:
[ul]
[li] Open a TFile in UPDATE mode[/li]
[li] Create a TTree[/li]
[li] Write the TTree to the TFile[/li]
[li] Close the TFile[/li][/ul]
This procedure is repeated a number of times with different (new) TTrees, each having a unique name, but the same TFile (which should work, since it’s opened in UPDATE mode).
It is important to note that, at the time of the first call to Write, I do not know the number or properties of the other TTrees.
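The loop above can be sketched roughly as follows (a minimal sketch, not the poster's actual code; the function and branch names are hypothetical, and it requires the ROOT headers and libraries to build):

```cpp
// Hedged sketch of the reported procedure: repeatedly reopen the same
// file in UPDATE mode and append one uniquely named TTree per call.
#include "TFile.h"
#include "TTree.h"

void writeOneTree(const char *filename, const char *treename) {
    TFile f(filename, "UPDATE");   // creates the file on the first call,
                                   // appends to it on later calls
    if (f.IsZombie()) return;      // open failed

    TTree t(treename, "example tree");
    double x = 0;
    t.Branch("x", &x, "x/D");
    for (int i = 0; i < 1000; ++i) { x = i; t.Fill(); }

    t.Write();                     // writes this tree's key into the file
    f.Close();
}
```

Each call writes a new key into the same file, which is the behavior UPDATE mode is meant to support.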
However, when running the code, during the write of the second or third TTree (depending on the data), I get the following error message:

file probably overwritten: stopping reporting error messages ===>File is more than 2 Gigabytes

I have already tried playing with TTree::SetMaxTreeSize and TTree::SetMaxVirtualSize, but raising these values has no effect on the error message.

It is of crucial importance that all four TTrees are written to the same TFile, since this is the input format required for later processing, over which I have no control.

FYI, I’m running on lxplus6 with ROOT 5.34/08 and gcc 4.7 sourced from CVMFS. My code is an executable compiled with gcc against the root libraries.

Can anybody help me figure out what’s going on and fix this problem?

Hi,

I filed this issue at sft.its.cern.ch/jira/browse/ROOT-5540

Cheers,
Philippe.

Any progress on this issue?

I am having very similar issues under very similar circumstances.
I get the same error as above but I also get a bunch of these as well:

Error R__unzip_header: error in header
Error in <TBasket::ReadBasketBuffers>: Inconsistency found in header (nin=0, nbuf=0)
Error in <TBasket::ReadBasketBuffers>: fNbytes = 65882844, fKeylen = 0, fObjlen = 1255999220, noutot = 0, nout=0, nin=0, nbuf=0
Error in <TBranchElement::GetBasket>: File: data/rootfiles/InSANE72448.-1.root at byte:-1, branch:fPBigcal_C.fY, entry:67820, badread=1, nerrors=8, basketnumber=17

I am repeating the same analysis on many different runs, and it only seems to happen about 5-10% of the time. When I do get these errors, the code runs very slowly but seems to finish.

Hi,

Were you able to track down the origin of this problem? It could have been a disk issue and/or a memory overwrite during writing (or, less likely, during reading).

Cheers,
Philippe.

Hi,

A related problem has recently been fixed. Could you check whether this problem is indeed gone in v6.08/02?

Thanks,
Philippe.