How to use TMVA methods in an older version of ROOT

Hi,
I’ll try to explain my problem the best I can.

I’m trying to use my TMVA-trained methods, a BDT and a DNN (i.e. trying to use TMVA::Reader), in an environment (CMSSW) that uses a different version of ROOT (6.06/01) from the one with which I trained my methods (6.10/06). This causes the following error:

— BDT : The following options were specified, but could not be interpreted: ‘RegressionLossFunctionBDTG=huber:HuberQuantile=7.000000e-01:SkipNormalization=False’, please check!
***> abort program execution
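
For reference, my application code is essentially the standard TMVA::Reader pattern, roughly as in the sketch below (the variable names and the weight-file path are only placeholders, not my actual setup):

```cpp
#include "TMVA/Reader.h"

// Minimal sketch of the application side; "pt", "eta" and the weight-file
// path are placeholders. In CMSSW the inputs would be filled per event.
void applyBDT() {
   float pt = 0.f, eta = 0.f;

   TMVA::Reader reader("!Color:!Silent");
   reader.AddVariable("pt",  &pt);
   reader.AddVariable("eta", &eta);

   // Weight file produced by the training done with ROOT 6.10/06.
   reader.BookMVA("BDT", "weights/TMVAClassification_BDT.weights.xml");

   pt  = 25.3f;
   eta = 1.1f;
   double mva = reader.EvaluateMVA("BDT");
   (void)mva;
}
```

The error above appears when BookMVA parses the weight file.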

First question: is there a simple workaround?

I tried (unsuccessfully) to link my libTMVA.so (from the newer version) into the software. After a couple of tries I think the compiler/linker accepted my library, but it can’t actually use it, giving this error:

undefined reference to `TMVA::Reader::Reader(TString const&, bool)’

I’m new to object-oriented programming, so I’m trying to trial-and-error my way out.
What I have tried to do so far:
-removing the linking specification for the TMVA lib of the ROOT version used by the code
-adding a link to the new version of libTMVA.so
-replacing #include "TMVA/Reader.h" with #include "/tmva/tmva/inc/TMVA/Reader.h" (a folder copied from the source code of the new version of ROOT).

I think my library somehow gets compiled in, but I can’t get the code to use its objects.

Can anyone help me get out of this mess?

Thank you,
Alberto

Unfortunately, I think the best approach here is to retrain your classifier in the old version of TMVA. We ensure that methods trained in an old version of TMVA will work in a newer one, but not the other way around.
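
If you do retrain with 6.06, note that the options flagged in your error message (RegressionLossFunctionBDTG, HuberQuantile, SkipNormalization) are newer BDT options that the old version does not understand, so leave them out of the booking string. Roughly like the sketch below, assuming the old-style Factory interface (no DataLoader); the trees, variables and option strings are just placeholders:

```cpp
#include "TFile.h"
#include "TTree.h"
#include "TMVA/Factory.h"
#include "TMVA/Types.h"

// Rough sketch of retraining a BDT with the pre-DataLoader Factory API;
// file names, tree names, variables and options are placeholders.
void retrainBDT() {
   TFile* input  = TFile::Open("train.root");
   TTree* sig    = (TTree*)input->Get("signal");
   TTree* bkg    = (TTree*)input->Get("background");
   TFile* output = TFile::Open("TMVA.root", "RECREATE");

   TMVA::Factory factory("TMVAClassification", output,
                         "!V:AnalysisType=Classification");
   factory.AddVariable("pt",  'F');
   factory.AddVariable("eta", 'F');
   factory.AddSignalTree(sig, 1.0);
   factory.AddBackgroundTree(bkg, 1.0);
   factory.PrepareTrainingAndTestTree("", "SplitMode=Random:NormMode=NumEvents");

   // Only options the old TMVA version understands.
   factory.BookMethod(TMVA::Types::kBDT, "BDT",
                      "NTrees=400:MaxDepth=3:BoostType=Grad:Shrinkage=0.1");

   factory.TrainAllMethods();
   factory.TestAllMethods();
   factory.EvaluateAllMethods();
   output->Close();
}
```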

Cheers,
Kim

Is there a way to use an old version of TMVA without recompiling the whole ROOT package? Compiling it takes hours.

Alberto

If you can’t run the training on the machine you run the application on, you could use lxplus and activate an appropriate ROOT version, e.g. /cvmfs/sft.cern.ch/lcg/app/releases/ROOT/6.06.02.

Note that 6.06/01 is a development release and should not be used in production AFAIK.

Cheers,
Kim