Setting ADAM optimizer parameters

Dear all,
I recently started working with TMVA's DNN (MethodDL) and I would like to compare against some results my group previously obtained with the sklearn library in Python. For that, I would like to set as many parameters as possible to the same values we used in Python, including the configuration of the ADAM optimizer. However, I could not find whether it is possible to change "beta1", "beta2" and "epsilon" from their default values (in fact, only the epsilon needs to be changed).
Could you confirm whether that is possible? I'm using ROOT v6.20.
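For reference, these are the standard Adam update rules (Kingma & Ba), showing where the three hyperparameters enter; sklearn's MLPClassifier documents its defaults as beta_1=0.9, beta_2=0.999 and epsilon=1e-8:

```latex
% Standard Adam update at step t (g_t = gradient, \alpha = learning rate)
m_t = \beta_1 \, m_{t-1} + (1 - \beta_1) \, g_t        % first-moment estimate
v_t = \beta_2 \, v_{t-1} + (1 - \beta_2) \, g_t^2      % second-moment estimate
\hat{m}_t = \frac{m_t}{1 - \beta_1^t}, \qquad
\hat{v}_t = \frac{v_t}{1 - \beta_2^t}                  % bias correction
\theta_t = \theta_{t-1} - \alpha \, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}
```

Note that epsilon only guards the division by the root of the second-moment estimate, so matching its value matters most when gradients are small.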

Thanks in advance.

Hi,
It is possible to change only the learning rate from the MethodDL configuration string; unfortunately the other parameters (beta1, beta2 and epsilon) cannot be changed and are fixed to their default values. It is very easy to patch the code yourself: the change goes where the TAdam class is constructed (sketched below), see root/MethodDL.cxx at d935aa24289524f35e9359a7f48e6694ed4227dc · root-project/root · GitHub
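To make this concrete, here is a minimal booking sketch for ROOT v6.20 (the layout and the other option values are placeholders): the learning rate is exposed through the training-strategy string, but there is no key for the Adam betas or epsilon.

```cpp
#include "TMVA/Factory.h"
#include "TMVA/DataLoader.h"
#include "TMVA/Types.h"

// Minimal booking sketch for ROOT v6.20: LearningRate is configurable in
// the training-strategy string, while beta1/beta2/epsilon stay at their
// hard-coded defaults inside TMVA.
void BookDLAdam(TMVA::Factory *factory, TMVA::DataLoader *dataloader)
{
   factory->BookMethod(dataloader, TMVA::Types::kDL, "DL_ADAM",
      "!H:!V:ErrorStrategy=CROSSENTROPY:VarTransform=N:"
      "Layout=DENSE|64|RELU,DENSE|1|LINEAR:"
      "TrainingStrategy=Optimizer=ADAM,LearningRate=1e-3,"
      "BatchSize=128,MaxEpochs=50,ConvergenceSteps=10");
}
```

And the spot to patch, roughly as it appears in the linked MethodDL.cxx (the extra constructor arguments and their default values are my assumption; check the TAdam header for the exact signature):

```cpp
// Inside the optimizer selection in MethodDL.cxx -- pass the Adam
// hyperparameters explicitly instead of relying on the defaults:
case EOptimizer::kAdam:
   optimizer = std::unique_ptr<DNN::TAdam<Architecture_t, Layer_t, DeepNet_t>>(
      new DNN::TAdam<Architecture_t, Layer_t, DeepNet_t>(
         deepNet, settings.learningRate,
         0.9,     // beta1 (assumed default)
         0.999,   // beta2 (assumed default)
         1e-8));  // epsilon, set here to match the sklearn value
   break;
```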

If you need it, I can easily open a PR implementing this option.

Lorenzo

Sorry for the delay.
Indeed, if you could, that would be great!
Thanks for the fast response and for your time. :smile:

A PR adding this feature is now open; see [TMVA] Add capability to pass optimizer options in training string for MethodDL. by lmoneta · Pull Request #7318 · root-project/root · GitHub
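Once that PR is merged, the extra options should be settable directly in the training-strategy string. The key names below are only a guess to show the idea; please check the PR diff for the actual spelling:

```cpp
// Hypothetical post-PR booking string: the Adam hyperparameter keys
// (Beta1/Beta2/Epsilon) are guessed names, not verified against the PR.
const char *trainingStrategy =
   "TrainingStrategy=Optimizer=ADAM,LearningRate=1e-3,"
   "Beta1=0.9,Beta2=0.999,Epsilon=1e-8";
```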
