Segmentation Violation using TTrees and numpy arrays

Folks,

I am seeing a segmentation violation when unpacking data from a TTree and eventually storing it in a numpy structured array. I have done something very similar before, putting the data into a regular numpy array instead, without any problems.
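The overall pattern is roughly the following; this is only a schematic sketch, and the file, tree, and branch names ("Donut", "x", "y", "flux") are placeholders rather than the ones in the attached script:

import ROOT
import numpy as np

# Open the file and size a structured array to the tree.
# All names below are placeholders for illustration only.
rootfile = ROOT.TFile.Open("donuts-20121003sequence1.root")
tree = rootfile.Get("Donut")

dtype = np.dtype([("x", np.float64), ("y", np.float64), ("flux", np.float64)])
data = np.zeros(tree.GetEntries(), dtype=dtype)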

I know that one must be careful not to put anything that is essentially a reference into the numpy array, so I copy the data into intermediate variables first, hoping that this will prevent any problems. I can't tell whether this is the source of the fault or not.
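Concretely, the fill loop copies each branch value into a plain Python float before it goes into the structured array, along these lines (continuing the sketch above; the branch names are again placeholders):

# Copy each branch value into a plain Python float first, so the
# structured array only ever holds copies, never anything tied to
# the tree's internal buffers. Branch names are placeholders.
for i in range(tree.GetEntries()):
    tree.GetEntry(i)
    x = float(tree.x)
    y = float(tree.y)
    flux = float(tree.flux)
    data[i] = (x, y, flux)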

It is clearly some kind of memory corruption problem, especially since the seg violation happens in different places when I add or subtract code. I did comment out all the ROOT code and put scalars into the intermediate variables, and then it ran fine. That is partial evidence that my code is OK - but who knows.

I’ve included the python script, the root file used, and the traceback below.

To reproduce the problem, just issue the command:

decamMesher-bug.py -i donuts-20121003sequence1.root -f 138322 -o testing -e FN1

I'm using ROOT 5.32/01 and Python 2.7.3 on a Mac running OS X 10.7.

Thanks very much for any assistance,

Aaron
donuts-20121003sequence1.root (947 KB)
traceback.txt (3.99 KB)
decamMesher-bug.py (7.05 KB)

Hi,

I’m not seeing it, neither by reading nor by running the code (it does complain about goodPoints() not having a return value, though). Valgrind isn’t any unhappier than usual either.

My first guess was an out-of-memory situation, but the total amount of memory used isn’t that large.

No clue.

Cheers,
Wim