Optimizer in TMVA::DNN?

Hi everyone,

I’m learning how to use the DNN in TMVA.
I’d like to use the Adam optimizer for my network, but I can’t find where or how to set it.
I can only find a way to do it through PyMVA.
I can find ‘Adam.h’, ‘Optimizer.h’, … headers in the $ROOT directory, so I think there should be a way to set the optimizer in the C++ TMVA as well.

Where and how can I apply an optimizer to my DNN method?

@moneta perhaps you can help?


Or maybe @kialbert?


Hi,

If you are using the kDL method (as in TMVAClassification.C), ADAM is the default optimizer. You can change it with the Optimizer option in the training-strategy string; e.g. Optimizer=ADAGRAD will use the Adagrad optimizer.
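As a rough sketch, here is how the Optimizer option can be passed when booking the kDL method. Apart from Optimizer itself, the layout and training values below are only illustrative and follow the style of the TMVAClassification.C macro (with the usual `factory`/`dataloader` objects from that macro), so check the macro for the exact option names your ROOT version supports:

```cpp
// Sketch: booking the kDL method with an explicit optimizer choice
// (values other than Optimizer are illustrative).
TString layoutString("Layout=DENSE|128|TANH,DENSE|128|TANH,DENSE|1|LINEAR");

TString trainingString("LearningRate=1e-3,Momentum=0.9,ConvergenceSteps=10,"
                        "BatchSize=128,MaxEpochs=20,"
                        "Optimizer=ADAM");  // e.g. Optimizer=ADAGRAD for Adagrad

TString dnnOptions("!H:!V:ErrorStrategy=CROSSENTROPY:VarTransform=N:"
                   "WeightInitialization=XAVIER:Architecture=CPU");
dnnOptions.Append(":");
dnnOptions.Append(layoutString);
dnnOptions.Append(":TrainingStrategy=");
dnnOptions.Append(trainingString);

// factory and dataloader are the usual TMVA::Factory and TMVA::DataLoader
factory->BookMethod(dataloader, TMVA::Types::kDL, "DNN_CPU", dnnOptions);
```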

Lorenzo


Actually, I was using kDNN, but it seems better to use kDL.
Anyway, would there be any disadvantage to using kDL rather than kDNN?
Additionally, I cannot find the SOFTMAX activation function in kDL, which exists in kDNN. I want to use the softmax function at the output layer and then use a (categorical) cross-entropy loss function.
How should I do this?

Hi,

Yes, it is better to use kDL, which is the new implementation supporting different types of layers.
When you define a cross-entropy loss function for the network, the softmax is applied automatically beforehand (i.e. it is included in the loss-function calculation). See for example
https://root.cern.ch/doc/master/Cpu_2LossFunctions_8hxx_source.html#l00077

To avoid applying the softmax function twice, you just need to set the activation of the last layer to LINEAR, as is done in the TMVAClassification.C example macro:
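A minimal sketch of what that looks like (layer sizes and the other options are illustrative; the point is the LINEAR last layer combined with ErrorStrategy=CROSSENTROPY, which supplies the softmax internally):

```cpp
// Sketch: LINEAR output layer + cross-entropy error strategy, so the
// softmax is applied only once, inside the loss function.
TString layoutString("Layout=DENSE|64|TANH,DENSE|64|TANH,DENSE|1|LINEAR");
TString dnnOptions("!H:!V:ErrorStrategy=CROSSENTROPY:"
                   "WeightInitialization=XAVIER:Architecture=CPU:");
dnnOptions.Append(layoutString);
```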

Lorenzo
