Hi,
I wondered if anyone has done much with integrating ROOT and Hadoop MapReduce, and if so, whether there are any workflow experiences you would care to share?
A month or so ago I tried running a simple ROOT task (storing values in a TTree) using a Hadoop map task that used Java JNI to call a C++ ROOT routine that simply places data in a TTree. This used ROOT 5.32 and Hadoop 0.20.205. It encountered a segmentation violation, before my code is even called, during the load of ROOT 5.32's native libCore.so library.
The only error information I receive is
*** Break *** segmentation violation
after the library is loaded. Before attempting to load libCore.so, I also explicitly load the libraries on which it depends.
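For reference, the load sequence in the mapper's static initializer looks roughly like this (a minimal sketch; the $ROOTSYS/lib path and the dependency names are from my installation and are placeholders here):

public class RootLoader {
    static {
        // Assumed $ROOTSYS/lib location; differs per installation.
        final String rootLib = "/usr/local/root/lib";
        // Placeholder dependency names; the real list came from running
        // `ldd libCore.so` on the task node.
        final String[] deps = { "liblzma.so", "libpcre.so" };
        for (String dep : deps) {
            System.load(rootLib + "/" + dep);   // dependencies first
        }
        // *** Break *** segmentation violation happens during this call:
        System.load(rootLib + "/libCore.so");
    }
}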
– Has anyone encountered this error?
I then assumed that Hadoop Pipes would be a better approach, providing separation between ROOT and Hadoop.
Assuming that the ROOT files live in HDFS, it seems the best way to construct a ROOT MapReduce workflow is to copy the ROOT files into a temporary local working directory, have the Pipes (C++ ROOT) tasks operate on them there, and then move the results back to HDFS if needed; a sketch of the staging step follows.
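For what it's worth, here is a rough sketch of the stage-in/stage-out step using the Hadoop FileSystem API (all paths are hypothetical; the C++ ROOT Pipes task would run between the two copies):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class RootStaging {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Stage in: copy the ROOT file from HDFS to the local working dir.
        Path hdfsInput = new Path("/data/events.root");
        Path localWork = new Path("file:///tmp/rootwork/events.root");
        fs.copyToLocalFile(hdfsInput, localWork);

        // ... the C++ ROOT Pipes task operates on the local copy here ...

        // Stage out: move the result back to HDFS if needed.
        Path localOut = new Path("file:///tmp/rootwork/result.root");
        Path hdfsOut = new Path("/results/result.root");
        fs.copyFromLocalFile(localOut, hdfsOut);
    }
}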
It seems that the preferable way would be to retrieve records from the ROOT files on the Java side and pass them to the ROOT map task via the Pipes interface (roughly as sketched below), but the fact that libCore will not load seems to preclude this option.
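For concreteness, the Java-side reader I had in mind would look something like this (old mapred API, as used by Pipes; the RootBridge JNI methods are hypothetical and depend on exactly the native library that fails to load):

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.RecordReader;

// Hypothetical JNI wrapper around the C++ ROOT code; this is the part
// whose native library currently fails to load.
class RootBridge {
    static native long countEntries(String rootFile);
    static native String readEntry(String rootFile, long entry);
}

public class RootRecordReader implements RecordReader<LongWritable, Text> {
    private final String rootFile;
    private final long totalEntries;
    private long entry = 0;

    public RootRecordReader(String rootFile) {
        this.rootFile = rootFile;
        this.totalEntries = RootBridge.countEntries(rootFile);
    }

    public boolean next(LongWritable key, Text value) throws IOException {
        if (entry >= totalEntries) return false;
        key.set(entry);
        // Record serialization format still to be decided; plain text here.
        value.set(RootBridge.readEntry(rootFile, entry));
        entry++;
        return true;
    }

    public LongWritable createKey() { return new LongWritable(); }
    public Text createValue() { return new Text(); }
    public long getPos() { return entry; }
    public float getProgress() { return totalEntries == 0 ? 1.0f : (float) entry / totalEntries; }
    public void close() { }
}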
– Are there any thoughts on addressing this?
Thanks
C
Hi,
Were you able to resolve this issue?
Philippe.