I’m using TMultiLayerPerceptron, which takes branches of a tree as input arguments (input neurons).
My network is not training properly; this may be because of the way I fill my tree (and hence my branches).
Here are all the tree related commands that I use in my code:
//Create tree called ZernikeTree
TTree *ZernikeTree = new TTree("simulation", "simulated data");
//Declare variables that will be used as branches
float ZRX00, ZRX11, ZRX20;
//Create branches
ZernikeTree->Branch("ZRX00", &ZRX00, "ZRX00/F");
ZernikeTree->Branch("ZRX11", &ZRX11, "ZRX11/F");
ZernikeTree->Branch("ZRX20", &ZRX20, "ZRX20/F");
//Fill the tree from two arrays declared earlier (ZTR and ZTE)
//(Those arrays have already been verified; their content is correct)
for (int i = 0; i < 625; ++i) {
  ZRX00 = ZTR[i][25]; ZernikeTree->Fill();
  ZRX00 = ZTE[i][25]; ZernikeTree->Fill();
}
//The same syntax is used for the two other branches, each in its own loop.
Is there anything wrong with this syntax? If not, how can I verify the content of the branches to understand where the problem is coming from?
Thank you!