I’m practicing how to use a DNN in TMVA.
I’d like to use the Adam optimizer for my network, but I can’t find where or how to set it.
I can only find a way to do it through PyMVA.
I can find ‘Adam.h’, ‘Optimizer.h’, … headers in the $ROOT directory, so I think there should be a way to set the optimizer in C++ TMVA.
Where and how can I apply an optimizer to my DNN method?
If you are using the kDL method (as in TMVAClassification.C), ADAM is the default optimizer. You can change it with the Optimizer option string; e.g. Optimizer=ADAGRAD will use the Adagrad optimizer.
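As a rough sketch of what that looks like in practice, here is a booking call in the style of TMVAClassification.C. The layer layout and the training-strategy values (learning rate, batch size, etc.) are placeholder choices for illustration, not recommendations; the relevant part is the Optimizer=ADAM token inside TrainingStrategy, which you could swap for ADAGRAD, RMSPROP, SGD, …:

```cpp
// Hedged sketch: booking a kDL method with an explicit optimizer choice.
// Assumes an existing TMVA::Factory* factory and TMVA::DataLoader* dataloader,
// set up as in the TMVAClassification.C tutorial.
factory->BookMethod(dataloader, TMVA::Types::kDL, "DL_ADAM",
    "!H:!V:ErrorStrategy=CROSSENTROPY:VarTransform=N:"
    "Layout=DENSE|128|RELU,DENSE|128|RELU,DENSE|1|LINEAR:"
    "TrainingStrategy=LearningRate=1e-3,ConvergenceSteps=10,"
    "BatchSize=256,MaxEpochs=50,"
    "Optimizer=ADAM:"        // <-- the optimizer option; e.g. ADAGRAD instead
    "Architecture=CPU");
```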
Actually, I was using kDNN, but it seems better to use kDL.
Anyway, would there be any disadvantage to using kDL rather than kDNN?
Additionally, I cannot find a SOFTMAX activation function in kDL, though one exists in kDNN. I want to use a softmax function at the output layer and then use a (categorical) cross-entropy loss function.
How should I do this?