Hello!
I work on the STAR experiment. Recently I have been trying to retrieve raw data from a remote cluster by creating ROOT files locally that store selected events.
However, some of these ROOT files always end up as zombies, and they account for a large portion of the whole dataset.
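By "zombie" I mean a file that opens but whose header cannot be read, i.e. TFile::IsZombie() returns true. A minimal sketch of the check I mean (the file name below is just a placeholder, not my actual output name):

#include "TFile.h"
#include <iostream>

void checkFile(const char* path = "selected_events.root") {
   TFile* f = TFile::Open(path);   // open the locally written output file
   if (!f || f->IsZombie()) {      // IsZombie() flags a corrupt or unreadable header
      std::cerr << path << " is a zombie file" << std::endl;
   }
   delete f;                       // closes the file if it was opened
}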
I checked the output file, and the last lines show:
file probably overwritten: stopping reporting error messages
Abort
Starting copy block
I checked the source file Branch.cxx; it shows that there are two cases in which this error message is printed:
1. the file is more than 2 GB;
2. the file is bigger than the maximum file size allowed on your system.
I don’t think either of these cases applies: first, I limit the tree size with the SetMaxTreeSize method; second, other jobs of similar size run fine.
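For reference, this is roughly how the limit is applied (a minimal sketch with placeholder names, not my actual job code). SetMaxTreeSize is a static method, so it is set once before filling, and GetCurrentFile() is used at the end because ROOT may switch to a new file when the limit is reached:

#include "TFile.h"
#include "TTree.h"

void writeSelected() {
   TTree::SetMaxTreeSize(1900000000LL);                         // cap each output file well below 2 GB
   TFile* out = new TFile("selected_events.root", "RECREATE");  // placeholder name; becomes the current directory
   TTree* tree = new TTree("events", "selected events");        // placeholder names
   // ... declare branches and fill the tree with the selected events ...
   tree->Write();
   tree->GetCurrentFile()->Close();   // the file may have changed if the size cap was hit, so do not use `out` here
}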
Can anybody give me a hint on this problem?
Thanks