TMultiLayerPerceptron: Line search fail

I’m trying to train a neural network. When I normalize the output neuron, all goes fine, but if I omit normalization, the training breaks and I get the following message:

Epoch: 0 learn=6025.75 test=5468.55
Epoch: 10 learn=6023.38 test=5466.25
Error in <TMultiLayerPerceptron::TMultiLayerPerceptron::Train()>: Line search fail
Epoch: 500 learn=5096.01 test=4716.13
Training done.

What does “Line search fail” mean?

I’m using ROOT 4.03/02

“Line search fail” means that the algorithm was not able to find a minimum of the training error along the search direction given by the derivative. Line search is used as an intermediate step by most of the training algorithms.

This means that the algorithm cannot converge, and it is often the consequence of a badly defined problem. This can happen if the inputs are not correlated with the output, or if too many useless variables are introduced.

Also, if the output is not normalized to [0,1], the algorithm cannot easily converge, since a sigmoid output neuron naturally returns values in that range. Only the global scale of the weights can stretch this, which limits the flexibility of the network and can lead to such a result.
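In practice this means rescaling your target variable into [0,1] before filling the training tree. A minimal sketch of such a min-max rescaling (the helper name is mine, not part of TMultiLayerPerceptron):

```cpp
#include <algorithm>
#include <vector>

// Hypothetical helper: linearly rescale target values into [0, 1]
// so a sigmoid output neuron can actually reach them.
// y' = (y - min) / (max - min); a constant column maps to 0.5.
std::vector<double> normalizeToUnitRange(const std::vector<double>& y) {
    auto [lo, hi] = std::minmax_element(y.begin(), y.end());
    double range = *hi - *lo;
    std::vector<double> out;
    out.reserve(y.size());
    for (double v : y)
        out.push_back(range > 0 ? (v - *lo) / range : 0.5);
    return out;
}
```

Remember to keep the min and max around so you can invert the transformation when reading predictions back out of the network.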
