Hi guys,
I wrote a macro that defines and trains a neural network, modelled on the MLPhiggs.C tutorial:
[code]{
if (!gROOT->GetClass("TMultiLayerPerceptron"))
{
gSystem->Load("libMLP");
}
//Open ROOT file containing data
//TFile input("trainclass.root");
TFile input("testclass.root");
//get data tree from opened ROOT file
TTree *CE = (TTree *) input.Get("CE");
Int_t NODEA, NODEB, NODEC, NODED, NODEE, linear, firstorder, secondorder, homogeneous;
CE->SetBranchAddress("NODEA", &NODEA);
CE->SetBranchAddress("NODEB", &NODEB);
CE->SetBranchAddress("NODEC", &NODEC);
CE->SetBranchAddress("NODED", &NODED);
CE->SetBranchAddress("NODEE", &NODEE);
CE->SetBranchAddress("linear", &linear);
CE->SetBranchAddress("firstorder", &firstorder);
CE->SetBranchAddress("secondorder", &secondorder);
CE->SetBranchAddress("homogeneous", &homogeneous);
//Build and train the Neural Network
TMultiLayerPerceptron *mlp = new TMultiLayerPerceptron("NODEA,NODEB,NODEC,NODED,NODEE:5:4:linear,firstorder,secondorder,homogeneous", CE, "Entry$%2", "(Entry$+1)%2");
mlp->Train(20, "text,graph,update=10");
}[/code]
When it tries to train the network I receive this error:
[quote]Training the Neural Network
Error in TMultiLayerPerceptron::TMultiLayerPerceptron::Train(): Line search fail
Error in TMultiLayerPerceptron::TMultiLayerPerceptron::Train(): Stop.
Epoch: 20 learn=nan test=nan
Training done.
[/quote]
The learning curve that is displayed shows increasing errors.
All the inputs are either 1 or 0, but the same thing occurs if I use floats between 0 and 1, so I don't think it is a normalization issue.
Thanks for any suggestions!
Johnny