Hello, I am filling and writing a tree in a simulation macro. When the program begins writing to a new file I immediately get a segmentation violation:
Fill: Switching to new file: ngenauauyt_1.root
*** Break *** segmentation violation
I have varied the file size from 1.9 GB to 1.0 GB using SetMaxTreeSize, and the crash always occurs upon creation of the new file. The created file ngenauauyt_1.root always has only 252 kB. The first file ngenauauyt.root has 1.9 GB or 1.0 GB, as it should.
Where does the segmentation fault occur? Are you using the original file pointer after the switch? In that case you need to replace myfile with mytree->GetCurrentFile().
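For context, the failure mode usually looks like the sketch below; myfile, mytree, and the sizes are hypothetical stand-ins, not taken from the macro:

```cpp
// Hypothetical sketch of the file-switch pitfall (names are invented).
TFile *myfile = TFile::Open("ngenauauyt.root", "RECREATE");
TTree *mytree = new TTree("T", "simulation output");
mytree->SetMaxTreeSize(1000000000LL);  // switch files at ~1.0 GB

// ... fill loop; once the limit is hit, ROOT closes ngenauauyt.root,
// opens ngenauauyt_1.root, and the old 'myfile' pointer dangles ...

// myfile->Write();                 // WRONG after the switch: may crash
mytree->GetCurrentFile()->Write();  // safe: the file currently in use
```

TTree::GetCurrentFile() always returns the file the tree is attached to at that moment, so it stays valid across the automatic switch.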
When the original file ngenauauyt.root reaches its maximum size another file ngenauauyt_1.root is created automatically by ROOT and exactly at this point the segmentation fault occurs. I am going to make an array of files and write to a new file after a fixed number of events are processed to see if that works.
[quote]When the original file ngenauauyt.root reaches its maximum size another file ngenauauyt_1.root is created automatically by ROOT and exactly at this point the segmentation fault occurs.[/quote]This is neither normal nor expected. Can you send a running example reproducing this problem?
I have set the file size to 10M so that the maximum is reached right away. However, if the default size of 1.9G is used the result is the same, only it takes longer to fill. NGenAuAuYt.C (19.9 KB)
Oops, SetMaxTreeSize is currently set to 1M, not 10M. Whatever it is set to, the segmentation fault occurs when the limit is reached and a second ROOT file is created.
g++ can compile such code because of its C99 support (C99 has variable-length arrays).
An array size in C++ MUST be a constant expression. Your _kMaxClusters is not one: 200 is a constant expression, but nclustermax is not (even though its value never changes). So this is ill-formed C++ code.
Even though this problem may not bite you explicitly in your program, in C++ you should avoid big arrays on the stack (your arrays are probably small, but I cannot deduce that from your code): on some platforms and implementations the stack size is limited and you will get a stack overflow.
pPi[p] = cluster.GetDecay(p);
should be
pPi[p] = *cluster.GetDecay(p);
Int_t n must be declared outside the loop if you want to use it after the loop.
“name” (TString name;) is redefined, ntracks is redefined, fdNpdx is undefined (it must be dNpdx), and we do not have a file with this histogram to reproduce your crash.
As a side note, in root -l -b NGenAuAuYt.C'(100000,3.,3.,3.,20,6.,18,.7)'++ > &! ngenauauyt.out & the ++ is misplaced; it should read root -l -b 'NGenAuAuYt.C++(100000,3.,3.,3.,20,6.,18,.7)' > &! ngenauauyt.out &
The problem is that the histograms are attached to the file you use for the TTree (treeout) and are thus deleted when you close treeout (and when the tree switches files). In addition, the way you close the treeout file is incorrect unless MaxTreeSize is set to infinity.
To solve these, after creating the histogram, i.e. around line 144, add: hParticipants->SetDirectory(0);
hClusters->SetDirectory(0);
hDecays->SetDirectory(0);
hRapidity->SetDirectory(0);
hBoost->SetDirectory(0);
hPhiB->SetDirectory(0);
and after the end of the loop (i.e. around line 466) use:[code] // print tree to out file
With the SetDirectory(0) calls, I cannot reproduce any crash.
Can you re-run your crash with either gdb or valgrind to see where it really is? (Note that you need to use root.exe rather than root; for example: valgrind root.exe -q -l -b 'NGenAuAuYt.C++(100000,3.,3.,3.,20,6.,18,.7)')
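Putting the two fixes together, the relevant parts of the macro would look roughly like this; the histogram names come from the post above, while tree is a hypothetical name for the macro's TTree and everything else is schematic:

```cpp
// Detach the histograms from the tree's file so the automatic
// file switch (and the final Close) cannot delete them.
hParticipants->SetDirectory(0);
hClusters->SetDirectory(0);
hDecays->SetDirectory(0);
hRapidity->SetDirectory(0);
hBoost->SetDirectory(0);
hPhiB->SetDirectory(0);

// ... event loop filling the tree and the histograms ...

// After the loop, never touch the original TFile pointer:
// SetMaxTreeSize may have moved the tree to ngenauauyt_1.root.
TFile *current = tree->GetCurrentFile();
current->Write();   // writes the tree header and any attached objects
current->Close();
```

The detached histograms can then be written explicitly to whichever file you choose, independent of the tree's file rotation.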
Merci beaucoup! Works perfectly, and the simulation time for 1000 events is faster than before (~30 s vs ~50 s previously); not sure why, but it must be because of something you did.
How did you start your script? Since you were not previously compiling (due to the misplaced ++ and to the C++ syntax errors), you should now be able to compile and be much faster than in the interpreted case.
Yes, I’m compiling now; that would explain it. Thank you for pointing out the correct C++ syntax. I wish I had learned C++ properly instead of ad hoc; it would have saved a lot of time.