Hi TMVA Experts,
(I also posted this to the sourceforge site; I’m not sure which is more correct.)
I trained an MLP net in ROOT version 5.28 and have had good success with the training and performance, etc. It’s an astounding package!
I had a colleague check my results, and he got bogus output using ROOT version 5.30. I confirmed that the output of the Reader changes when I switch between these versions.
reader = new TMVA::Reader("Debug");
(Where net_id is a string labelling fileX. I have 105 neural nets that are all being loaded into the same reader, corresponding to different eta regions, pt, and conversion…)
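For context, the booking pattern looks roughly like this. This is only a sketch: the variable names, the loop over net IDs, and the weight-file path scheme are placeholders for illustration, not the actual code (which is linked below).

```cpp
#include "TMVA/Reader.h"
#include <string>
#include <vector>

// Sketch of booking many nets into one Reader (hypothetical names/paths).
TMVA::Reader* reader = new TMVA::Reader("Debug");

// Input variables must be registered before booking any MVA.
Float_t pt = 0, eta = 0;
reader->AddVariable("pt",  &pt);
reader->AddVariable("eta", &eta);

// One weight file per (eta, pt, conversion) bin; net_id labels the bin.
std::vector<std::string> net_ids = /* 105 bin labels */ {};
for (const std::string& net_id : net_ids) {
  reader->BookMVA("MLP_" + net_id,
                  "weights/" + net_id + "_MLP.weights.xml");
}

// Later, per event, after filling pt and eta:
// Double_t out = reader->EvaluateMVA("MLP_" + some_net_id);
```

The point of showing this is just that the same weight files are read by both ROOT versions; only the ROOT/TMVA release differs between the two runs.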
I then evaluate normally.
This all worked fine in 5.28. Is there any reason that things should change in 5.30? I would have thought that the Reader output would be completely stable across releases, given a fixed set of weights – the same values in should give the same value out! Am I doing anything obviously wrong?
I’m sorry that it is hard to post a ‘minimal example’ for a full NN implementation. My reader class is here: