Make sure that the NN output is between 0 and 1


I am training a NN and I would like to know how I can make sure that the NN output on which I will cut lies within [0,1].

I can of course apply a sigmoid transformation to the output value to map it into (0,1), but this is pretty inconvenient, since the output neuron value is already ~1 (just not guaranteed to be within [0,1]).

If I could access the max and min values of the output neuron, I could apply a linear transformation to bring its value within [0,1].
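For what it's worth, that min/max rescaling can be done by hand after evaluating the network over the sample. Here is a minimal sketch in plain C++, independent of ROOT; the raw values stand in for whatever the output neuron returns (e.g. from TMultiLayerPerceptron::Evaluate):

```cpp
#include <algorithm>
#include <vector>

// Linearly rescale raw network outputs to [0,1] using the min/max
// observed on the sample. The input vector stands in for the values
// the output neuron produced over the whole sample.
std::vector<double> rescaleToUnit(const std::vector<double>& raw) {
    auto mm = std::minmax_element(raw.begin(), raw.end());
    double lo = *mm.first;
    double span = *mm.second - lo;
    std::vector<double> out;
    out.reserve(raw.size());
    for (double v : raw)
        // Degenerate case: if the output is constant, map everything to 0.5.
        out.push_back(span > 0 ? (v - lo) / span : 0.5);
    return out;
}
```

Note that the min/max are taken from the sample at hand, so a new event could still fall slightly outside [0,1]; clamping the result would guard against that.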

Is there a simple way to directly set the output neuron type to TNeuron::kSigmoid instead of kLinear?
The only solution I have found so far is to add, right before the output layer, a hidden layer with a single node, which I know is a TNeuron::kSigmoid.
Thanks for your help, nabil.

Hi again,
in the end, I found the answer to my question in this article:

K. Hornik et al., "Multilayer feedforward networks are universal approximators",
Neural Networks, Volume 2, Issue 5 (1989), pp. 359–366.

Sorry for the noise.