ROOT and the Hadoop Distributed Filesystem (HDFS)

Hi,

I am planning to read ROOT files from HDFS with ROOT and PROOF. Currently I am using Hadoop 1.04 and ROOT 5.34/01 on Ubuntu 12.04.
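
For context, the kind of access I am aiming for looks roughly like the sketch below (the file path and tree name are just placeholders); as far as I understand, an hdfs:// URL should be dispatched to THDFSFile through ROOT's plugin manager:

#include "TFile.h"
#include "TTree.h"

void read_from_hdfs()
{
   // Placeholder path; the hdfs:// prefix should select THDFSFile.
   TFile *f = TFile::Open("hdfs:///user/fabian/data.root");
   if (!f || f->IsZombie()) return;

   TTree *t = 0;
   f->GetObject("events", t);   // "events" is a placeholder tree name
   if (t) t->Print();
   f->Close();
}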

The current version of THDFSFile no longer seems to be compatible with the current HDFS C API. When I try to compile, I get the following error:

/opt/root/io/hdfs/src/THDFSFile.cxx: In constructor ‘THDFSFile::THDFSFile(const char*, Option_t*, const char*, Int_t)’:
/opt/root/io/hdfs/src/THDFSFile.cxx:97:60: error: too many arguments to function ‘void* hdfsConnectAsUser(const char*, tPort, const char*)’
/opt/hadoop/src/c++/libhdfs/hdfs.h:106:13: note: declared here
/opt/root/io/hdfs/src/THDFSFile.cxx: In constructor ‘THDFSSystem::THDFSSystem()’:
/opt/root/io/hdfs/src/THDFSFile.cxx:310:60: error: too many arguments to function ‘void* hdfsConnectAsUser(const char*, tPort, const char*)’
/opt/hadoop/src/c++/libhdfs/hdfs.h:106:13: note: declared here
make: *** [io/hdfs/src/THDFSFile.o] Error 1
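
The declaration at hdfs.h:106 only takes host, port and user, while THDFSFile.cxx still passes the extra group arguments of the older API. Reducing the call to the three-argument form does compile; roughly like this (simplified, the user-name lookup is omitted here):

#include "hdfs.h"

hdfsFS ConnectDefault(const char *user)
{
   // host = "default" and port = 0 are supposed to pick up the filesystem
   // from the Hadoop configuration files -- see the problem described below.
   return hdfsConnectAsUser("default", 0, user);
}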

However, simply reducing the number of arguments does not solve the problem: the default configuration, selected by passing host = “default” and tPort = 0 to hdfsConnectAsUser, does not seem to use the configured default HDFS filesystem but the local filesystem instead. For now I have worked around this by hardcoding my HDFS settings into THDFSFile, which is not the desired solution.
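
Roughly, that hardcoded workaround looks like this (host and port are placeholders for my actual namenode settings):

#include "hdfs.h"

hdfsFS ConnectHardcoded(const char *user)
{
   // Placeholder namenode host/port instead of "default"/0; these values
   // really belong in the Hadoop configuration, not in THDFSFile itself.
   return hdfsConnectAsUser("namenode.example.org", 9000, user);
}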

Which version of HDFS was the current THDFSFile developed against?
It is quite hard to find any information on the web about combining ROOT and HDFS. I would also be interested in using PROOF to analyse data directly on HDFS. Does anyone have experience with this?
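
In case it helps to be concrete, the PROOF setup I have in mind is roughly the following (tree name, selector and file paths are placeholders, and I do not yet know whether THDFSFile works on PROOF workers at all):

#include "TChain.h"
#include "TProof.h"

void run_proof_on_hdfs()
{
   TProof::Open("lite://");                          // local PROOF-Lite session for testing
   TChain chain("events");                           // placeholder tree name
   chain.Add("hdfs:///user/fabian/data/file1.root");
   chain.Add("hdfs:///user/fabian/data/file2.root");
   chain.SetProof();                                 // route processing through PROOF
   chain.Process("MySelector.C+");                   // placeholder selector
}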

Thanks in advance, Fabian