Initialisation of weights in the TMultiLayerPerceptron class

Hello,

I understand that the initial weights in the TMultiLayerPerceptron network are initialised to random values. I would like to build several TMultiLayerPerceptron objects, each with a different number of neurons in the hidden layer. Then I want to train and test each of them on the same data sets and see whether the number of neurons affects performance. However, the values of the initial weights also affect the performance, so it seems I want to use the same initial values for the weights for each network.

I can't understand how to do this, though. All I can see is the option to set the event weight in the constructor. Is this ‘event weight’ related to the initial weights? And why must I pass it as a char* and not, say, a Double_t? A weight sounds like a number to me, so I am puzzled.

I notice I can retrain the same object with the same initial weights by passing the “+” option to Train. But I can't see a setter to change the number of neurons in the hidden layer, which would be one way to achieve what I would like to do.

I would be grateful for any advice on the event weight and how it might relate to what I want to do.

I am using ROOT 3.10/02, by the way.

Thanks,

Mark

[quote=“markhod”]Hello,
I can't understand how to do this, though. All I can see is the option to set the event weight in the constructor. Is this ‘event weight’ related to the initial weights? And why must I pass it as a char* and not, say, a Double_t? A weight sounds like a number to me, so I am puzzled.
[/quote]
Event weights and initial network weights are different things.
The event weight allows you to give more or less importance to each event in your sample during learning. The char* that the constructor expects is not a number but a formula, based on the branches of your tree, that is evaluated for each event to obtain its weight.
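
For example, a minimal sketch, assuming tree points to a TTree with hypothetical branches x and y (the inputs), signal (the output) and w (the per-event weight):

// Layout string: inputs x and y, 5 hidden neurons, output signal.
// The second argument "w" is the event-weight formula, evaluated per event.
TMultiLayerPerceptron mlp("x,y:5:signal", "w", tree,
                          "Entry$%2==0",   // training set: even entries
                          "Entry$%2==1");  // test set: odd entries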

What you want is to load initial weights into your network.
For that you have to use

TMultiLayerPerceptron::LoadWeights(Option_t* filename)

Beware! You have to use pseudo-random initial weights. If your weight matrix is symmetric, it will remain so during learning, which effectively reduces the number of degrees of freedom.
One clean solution to prepare the file with the weights is:

if (firstTime) {
  myPerceptron.Randomize();                      // initialize with random weights
  myPerceptron.DumpWeights("randomWeights.txt"); // save them as reference
}
else {
  myPerceptron.LoadWeights("randomWeights.txt"); // initialize with the reference weights
}
myPerceptron.Train(1000, "+ graph text update=50"); // "+" starts from the current weights

Finally, note that if you change the number of neurons in the network, you have to add or remove weights accordingly in the file: the weight file must match the network topology.

[quote=“markhod”]
I notice I can retrain the same object with the same initial weights by passing the “+” option to Train. But I can't see a setter to change the number of neurons in the hidden layer, which would be one way to achieve what I would like to do.
Mark[/quote]
That’s it. Just prepare a weight file, load it, and then train the network with the “+” option.
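
Putting it together, a hypothetical sketch of such a scan (same branch names as above; note that each topology needs its own weight file, because the number of weights depends on the number of hidden neurons):

// Compare networks with 2 to 10 hidden neurons on the same tree.
for (Int_t n = 2; n <= 10; ++n) {
  TString layout = Form("x,y:%d:signal", n);  // n hidden neurons
  TString wfile  = Form("weights_%d.txt", n); // one weight file per topology
  TMultiLayerPerceptron mlp(layout, "w", tree, "Entry$%2==0", "Entry$%2==1");
  mlp.Randomize();           // random starting point for this topology
  mlp.DumpWeights(wfile);    // save it so the run can be reproduced later
  mlp.Train(1000, "+ text"); // "+" keeps the weights just set
}

In a later session you can rebuild the same network and call LoadWeights(wfile) before Train(..., "+ ...") to restart from exactly the same initial weights.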

Best,
Christophe.

Hello,

Thanks for all your help.

You said I should use the Randomize function the first time through. However, I thought TMultiLayerPerceptron did randomisation of the weights automatically. Could you clarify this, please?

Thanks again,

Mark

[quote=“markhod”]
You said I should use the Randomize function the first time through. However, I thought TMultiLayerPerceptron did randomisation of the weights automatically. Could you clarify this, please?
Mark[/quote]
Randomisation is done automatically before the training starts, as long as you do not use the “+” option.
Here I just meant that the weights you load with LoadWeights must look random, and that Randomize is a good way to produce such a list of weights.
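
In other words (same hypothetical names as above):

myPerceptron.LoadWeights("randomWeights.txt");
myPerceptron.Train(500, "+ text"); // "+": training starts from the loaded weights
myPerceptron.Train(500, "text");   // no "+": the weights are randomised again first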

Best,
Christophe.