Is there a way to build a recurrent neural network using the MLP constructor?
I built a feed-forward NN with TMultiLayerPerceptron that works fine, but now I need to build a recurrent NN, and I am wondering whether there is a way to specify the type of NN as an argument in the constructor.
Thanks in advance.
I have asked Christophe Delaere (the author of TMultiLayerPerceptron) to answer your question. He seems to be away, so please be patient.
Rene
If you really need a recurrent NN, use SNNS (an external application; see www.snns.org). Then use snns2c to convert the file generated by SNNS into a C file and use it under ROOT.
YC
Well… I am back
Unfortunately, I don’t know what you mean by “recurrent neural network”.
In the constructor of TMultiLayerPerceptron, you can specify the “geometry” of the network, as well as the type of neuron and, optionally, the normalization.
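For reference, a minimal sketch of such a constructor call in a ROOT macro. The branch names (`x1`, `x2`, `y`) and the chosen layout are assumptions for illustration; the layout string only fixes the feed-forward geometry, it cannot express feedback connections:

```cpp
// Sketch only: assumes a TTree* t with input branches x1, x2 and target y.
// In the layout string, "@" requests normalization of an input and
// "8" declares one hidden layer of eight neurons.
TMultiLayerPerceptron mlp("@x1,@x2:8:y", t,
                          "Entry$%2==0",   // training set: even entries
                          "Entry$%2!=0");  // test set: odd entries
mlp.Train(100, "text");                    // 100 epochs, textual progress
```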
Hi, NN experts,
my friend asked me: do we have GMDH algorithms (Group Method of Data Handling,
described at www.gmdh.net) implemented in the recently introduced
"neural network libs" (TMVA, MLP, etc.)?
These algorithms are used to build generic, self-organizing, statistically learning neural
networks.
If not, do you plan to add them?
Thanks. Regards. Valeriy
[quote=“delaere”]Well… I am back
Unfortunately, I don’t know what you mean by “recurrent neural network”.
In the constructor of TMultiLayerPerceptron, you can specify the “geometry” of the network, as well as the type of neuron and, optionally, the normalization.[/quote]
Hi Delaere. Sorry that I’m late, but I’ve been out of town.
If I’m not mistaken, the literature calls a NN with feedback from its outputs to its inputs a recurrent NN. My limited experience with the ROOT MLP is constructing the NN by declaring the inputs and outputs from the TTree, but I do not know how to construct a NN with feedback (if it is possible at all).
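To make the feedback idea concrete, here is a minimal plain-C++ sketch of one Elman-style recurrent unit: the hidden state is fed back into the neuron at every time step, which is exactly the output-to-input feedback meant above. The weights are hand-picked placeholders, not anything TMultiLayerPerceptron provides:

```cpp
#include <cmath>

// One step of a single recurrent unit. The state h is both an output of
// the previous step and an input to the current one; a plain MLP has no
// such state, so it cannot express this.
double recurrentStep(double x, double &h,
                     double wIn, double wRec, double bias) {
    // the new state depends on the current input AND the previous state
    h = std::tanh(wIn * x + wRec * h + bias);
    return h;
}
```

Feeding a sequence through this unit updates `h` at every call, so the same input value can produce different outputs depending on history; that is the behavior a feed-forward layout string cannot describe.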
Thank you