Multi Layer Perceptron learning rate

I am trying to train a neural network on a rather heavy problem, and the network always produces the same output. Changing the learning rate would be a good thing to try. Does anyone know how to do that with the Multi Layer Perceptron?

If you also happen to know how to change the activation function, that would be nice 🙂

Thanks!!

I would recommend using TMVA and its DNN implementation for your problem (I assume you are referring to ROOT's MLP, TMultiLayerPerceptron). TMVA is part of the standard ROOT distribution.
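
That said, if you do stick with the classic TMultiLayerPerceptron, both things can be set there directly: the learning rate (eta) used by the stochastic learning method via SetEta, and the activation function via the TNeuron::ENeuronType argument of the constructor (it is fixed for the whole network at construction time). A minimal sketch, assuming hypothetical branch names x, y and type in your tree:

#include "TTree.h"
#include "TMultiLayerPerceptron.h"
#include "TNeuron.h"

void TrainMLP(TTree *data) {
   // Layout: inputs x and y, two hidden layers (8 and 4 neurons),
   // output branch "type"; tanh neurons instead of the default sigmoid.
   TMultiLayerPerceptron mlp("x,y:8:4:type", data,
                             "Entry$%2==0", "Entry$%2!=0",
                             TNeuron::kTanh);
   mlp.SetLearningMethod(TMultiLayerPerceptron::kStochastic);
   mlp.SetEta(0.05);  // learning rate used by the stochastic minimiser
   mlp.Train(200, "text,update=10");
}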

The TMVA User's Guide can be found here (https://github.com/root-project/root/blob/master/documentation/tmva/UsersGuide/TMVAUsersGuide.pdf); check out the section on DNN, where you will find all the parameters that can be tweaked.
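
In the DNN method both of the things you ask about are set through the option string when booking the method: the per-layer activation functions in Layout and the learning rate in TrainingStrategy. A minimal sketch, assuming the Factory/DataLoader API of a recent ROOT; the exact option keys can differ between versions, so check the guide:

#include "TFile.h"
#include "TMVA/Factory.h"
#include "TMVA/DataLoader.h"
#include "TMVA/Types.h"

void BookDNN() {
   auto outputFile = TFile::Open("TMVA_DNN.root", "RECREATE");
   TMVA::Factory factory("TMVAClassification", outputFile,
                         "!V:AnalysisType=Classification");
   TMVA::DataLoader loader("dataset");
   // ... AddVariable / AddSignalTree / AddBackgroundTree calls go here ...

   // Activation function per layer is chosen in Layout,
   // the learning rate inside TrainingStrategy.
   factory.BookMethod(&loader, TMVA::Types::kDNN, "DNN",
                      "Layout=TANH|128,TANH|128,LINEAR:"
                      "TrainingStrategy=LearningRate=1e-2,Momentum=0.9,"
                      "BatchSize=256,ConvergenceSteps=20");
   // factory.TrainAllMethods(); etc. once the data is declared
   outputFile->Close();
}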

Working examples and tutorials can be found and run as follows:

cd $ROOTSYS/tutorials/tmva
root -l 'TMVAClassification.C("DNN")'

Cheers,
Kim
