MLP cannot build the classifier plot

Dear all experts,

I have also tried the MLP model in TMVA package, but it shows:

MLP : [dataset] : Loop over test events and fill histograms with classifier response…
: initial interval w/o root: (a=1.79769e+308, b=-1.79769e+308), (Eff_a=0, Eff_b=0), (fa=-0.005, fb=-0.005), refValue = 0.005
: initial interval w/o root: (a=1.79769e+308, b=-1.79769e+308), (Eff_a=0, Eff_b=0), (fa=-0.015, fb=-0.015), refValue = 0.015
: initial interval w/o root: (a=1.79769e+308, b=-1.79769e+308), (Eff_a=0, Eff_b=0), (fa=-0.025, fb=-0.025), refValue = 0.025

And the MLP setup is:

factory->BookMethod( dataloader, TMVA::Types::kMLP, "MLP", "H:!V:NeuronType=tanh:VarTransform=N:NCycles=600:HiddenLayers=N+5:TestRate=5:!UseRegulator" );

It seems that the MLP model cannot build the classifier plot, but I am not sure what I did wrong.

Has anyone run into this before?

Thanks in advance for your help!


The MLP will be retained for backwards compatibility, but for all new uses, the DNN is recommended.

What you could try is changing the training algorithm (e.g. TrainingMethod=BFGS), switching to batched learning (BPMode=batch), and changing the estimator type to use CE (cross entropy) for classification.
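As a sketch, those suggestions could be combined into the booking call like this (the exact option names, in particular EstimatorType=CE, are assumptions based on the TMVA option conventions; check the MLP option reference for your ROOT version):

```cpp
// Sketch: rebook the MLP with the suggested options.
// TrainingMethod=BFGS  -> use the BFGS optimizer instead of plain back-propagation
// BPMode=batch         -> batched learning instead of sequential updates
// EstimatorType=CE     -> cross-entropy estimator for classification (assumed option name)
factory->BookMethod( dataloader, TMVA::Types::kMLP, "MLP",
    "H:!V:NeuronType=tanh:VarTransform=N:NCycles=600:HiddenLayers=N+5:"
    "TestRate=5:!UseRegulator:TrainingMethod=BFGS:BPMode=batch:EstimatorType=CE" );
```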


Dear Kim,

Yes, it works! Thanks for the help!
Do you know where I can find information on how to optimize those hyper-parameters, like the TrainingMethod, BPMode, and estimator type you suggested? That kind of experience is very valuable to learn. (The User Guide does not cover it in much detail…)

Kind regards,



I don’t have a particular resource off the top of my head, but in general any article on DNNs should translate to the MLP case. The main difference is that DNN development has moved toward “simpler” training algorithms and more complex architectures; the principles behind them are the same, though.

The particular error you saw is caused by an implementation detail of the back-propagation in the TMVA MLP, so documentation on it will be hard to find.