TMultiLayerPerceptron: maximum number of neurons?

Hi all,

I am working with ROOT 4.00/08.

In the Reference Guide for the appropriate ROOT version, I read

From this I understand that there is no limit on the number of neurons in ROOT's TMLP. I searched this forum, but I did not find any other clue on the topic.

But when I train a network with the structure 8 (input) : 32 (hidden, in a single layer) : 1 (output), I obtain the following output:

As you can see, the training seems to proceed and terminate well, and the plots are in agreement with expectations. But there is the line

"Error in : index 32 out of bounds (size: 32, this: 0x09137874)"

This error does not appear with 17, 12, 8, or 4 neurons in the hidden layer.
I almost forgot: for the output analysis I use the TMLPAnalyzer functions I found in the root.cern.ch/root/html/examples/mlpHiggs.C.html tutorial.
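For reference, here is roughly what those calls look like in mlpHiggs.C (a sketch; the network pointer mlp and the truth branch type are the tutorial's names, not mine):

[code]
// Sketch following the mlpHiggs.C tutorial: inspect a trained
// TMultiLayerPerceptron with TMLPAnalyzer. "mlp" and "type" are the
// tutorial's names; adapt them to your own network and tree.
TMLPAnalyzer ana(mlp);
ana.GatherInformations();                  // collect info on the trained network
ana.CheckNetwork();                        // print a summary to the terminal
ana.DrawDInputs();                         // impact of each input on the output
mlp->Draw();                               // draw the network structure
ana.DrawNetwork(0, "type==1", "type==0");  // output for signal vs background
[/code]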

So, is there a limit on the maximum number of neurons one can build the net with? And should I consequently reject the output of the training with 32 hidden neurons?

Thanks in advance
Pietro

Hi,

The error message comes from indexing neuron number 32, but there are only 32 neurons, so the last one has index 31. Remember, this is not Fortran, so we start counting at 0. Could you send a piece of working code that shows this error?
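To illustrate (a minimal sketch, not taken from your code): a TObjArray of the kind TMLP uses to hold its neurons prints exactly this message when asked for an index equal to its size:

[code]
// Minimal sketch reproducing the error text with a plain TObjArray.
// With 32 entries, the valid indices are 0..31.
#include "TObjArray.h"
#include "TNamed.h"

void offByOne()
{
   TObjArray neurons(32);
   for (Int_t i = 0; i < 32; ++i)
      neurons.Add(new TNamed(Form("neuron%d", i), ""));

   neurons.At(31); // fine: last valid index
   neurons.At(32); // prints: index 32 out of bounds (size: 32, ...)
}
[/code]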

Cheers, Axel.

Hi,

Indeed, there is no limitation on the number of neurons, so there is no explanation for this error message (apart from a bug).

From the output you provide it is not clear when this happens, and I cannot reproduce it (there is no error when putting 32 neurons in mlpHiggs.C). Could you provide minimal code to reproduce this? Maybe you do something between the training and the export that is not done in mlpHiggs.C?

As Axel spotted, the message means one tries to access the 33rd neuron in a row of 32.
Christophe.

Hi! This is my piece of code…

[quote]TMultiLayerPerceptron *myNN = new TMultiLayerPerceptron("NN_etjetl5,NN_deteta,NN_emfjet,NN_mass,NN_phijet,NN_ntrk_tot,NN_pt_tot,NN_metprj:32:NN_wdiff","NN_type",myNN_tree,"Entry$%2","(Entry$+1)%2");
myNN->SetLearningMethod(TMultiLayerPerceptron::kBFGS);

cout << "Starting the training..." << endl;
myNN->Train(100,"text,graph,update=10");
cout << " Finished training. " << endl;
myNN->Export("NN_trained","C++");
cout << " Exported. " << endl;[/quote]

It seems that the only thing I do between training and exporting is setting the learning method. Could that be the problem?
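For reference, the exported NN_trained class is meant to be used standalone, roughly like this (a hedged sketch; the exact generated interface depends on the ROOT version, and the input values below are placeholders):

[code]
// Hedged sketch: using the standalone class generated by
// myNN->Export("NN_trained","C++"). For 8 inputs, the generated class
// exposes Value(index, in0, ..., in7), where index selects the output
// neuron. The numbers below are placeholders.
#include "NN_trained.cxx"

double evalOnce()
{
   NN_trained net;
   return net.Value(0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8);
}
[/code]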

By the way, I have another question, but I do not want to open too many threads, so I will post it here:

To the best of my knowledge, ROOT neural networks are an instrument targeted mainly at pattern recognition between two different distributions. In other words, TMLP training consists in teaching the network to discriminate between two distributions with different characteristics. Am I right?
If this is true, one could say that it makes no sense to train a network using only signal events, without background events, doesn't it? Using only signal events in training seems to me like sculpting the signal itself, which would be incorrect. Am I right, or am I missing some important point?

Hi,
could you attach your code, please? As you might have seen, the board messes code up if HTML is enabled and "<<" is involved…

TMLP can be used for classification and for regression. For the former you need both signal and background events to train; for the latter you have a notion of how much an event is a signal, so using only signal events would be sufficient, as long as you have some more and some less "signal-like" events. Still, training with a wide range of classes, and samples that are evenly weighted (by numbers or weights) over the different classes, definitely helps.
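To make the distinction concrete, here is a hedged sketch (the tree and the branch names x1, x2, type, signalness are hypothetical, not from this thread):

[code]
// Hedged sketch contrasting the two uses of TMultiLayerPerceptron.
#include "TTree.h"
#include "TMultiLayerPerceptron.h"

void buildNets(TTree *tree)
{
   // Classification: the target branch "type" is 0 for background and 1
   // for signal, so the training tree must contain both classes.
   TMultiLayerPerceptron clf("x1,x2:10:type", tree,
                             "Entry$%2", "(Entry$+1)%2");

   // Regression: the target branch "signalness" is a continuous measure
   // of how signal-like an event is; a signal-only sample can suffice,
   // as long as it spans a range of target values.
   TMultiLayerPerceptron reg("x1,x2:10:signalness", tree,
                             "Entry$%2", "(Entry$+1)%2");

   clf.Train(50, "text");
   reg.Train(50, "text");
}
[/code]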

Cheers, Axel.

Sorry for the delay!

Here is the code
code.C (442 Bytes)