Keeping constant target nodes

Hi,

I am trying to train an MLP with training data in which certain target nodes have the same value for all training inputs. However, when I call the `TrainAllMethods` method, it says that I have to remove this target node because it is constant. I understand that an output node whose training value does not change across training inputs does not add anything to the training process, but I want to keep the MLP output layout as it is. Is there a way to disable this check?

Kind regards,

Pim

Hi Pim,

I’m not sure I understand the problem here. What is the motivation for keeping the output constant in the first place? Doing so would mean that it would not be possible to do back propagation from these outputs. Maybe spectator variables are what you’re looking for?
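For reference, this is roughly how a spectator is registered with a TMVA `DataLoader`, as opposed to a regression target; the variable names and options below are placeholders, not your actual setup:

```cpp
// Hedged sketch of TMVA variable registration (ROOT/TMVA API).
// Names ("x1", "y_varying", "y_constant", "dataset") are placeholders.
#include "TMVA/DataLoader.h"

TMVA::DataLoader *loader = new TMVA::DataLoader("dataset");

loader->AddVariable("x1", 'F');      // input feature
loader->AddTarget("y_varying");      // regression target, trained via back propagation
loader->AddSpectator("y_constant");  // carried through the dataset, never trained

// A constant *target* is rejected by training, but a spectator is not:
// factory->BookMethod(loader, TMVA::Types::kMLP, "MLP", "<options>");
// factory->TrainAllMethods();
```

A spectator is stored alongside the event and written to the output, but it takes no part in training, which may be what you need if the value only has to be kept, not learned.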

Cheers,
Sitong

Hi Sitong,

I do not want to keep the output node constant. It is the training value of this specific node that is constant across the whole training dataset. However, the training data that other users supply might or might not have this property. TMVA refuses to train a node whose target value is the same for the whole training dataset, but I would say that back propagation would simply train the MLP to produce outputs close to that value.
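To illustrate that last point with something minimal (this is plain gradient descent, not TMVA): back propagation on a squared-error loss with a constant target is perfectly well defined, and simply drives the output toward that constant. The function and values below are made up for the demonstration:

```python
def train_constant_target(inputs, target=0.7, lr=0.1, epochs=500):
    """Train a single linear output node y = w*x + b on a constant target
    using per-sample gradient descent (the back-propagation update for
    one node with a squared-error loss)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x in inputs:
            y = w * x + b
            err = y - target       # dL/dy for L = 0.5 * (y - target)**2
            w -= lr * err * x      # gradient step on the weight
            b -= lr * err          # gradient step on the bias
    return w, b

w, b = train_constant_target([0.1, 0.5, 0.9])
outputs = [w * x + b for x in [0.1, 0.5, 0.9]]
print(outputs)  # all close to the constant target 0.7
```

So nothing breaks numerically; the check in TMVA is presumably there because a constant target carries no information worth learning.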

Kind regards,

Pim

Hi Pim,

That’s strange. Can you post/send me the code snippet you use to register the variables and the method?

Cheers,
Sitong