TMultiLayerPerceptron with TMinuit FCN in unbinned fit

Dear ROOTers,

I’m interested in using TMultiLayerPerceptron to model a 2-dimensional distribution. Rather than relying on the MLP’s built-in minimization scheme, I would like to access the network function directly, so that I can determine the weights of the neural net from an unbinned extended maximum-likelihood fit, using Minuit to minimize the negative log-likelihood of the normalized pdf I get from the TMultiLayerPerceptron.

I see that TMVA has set up a function to use Minuit as a minimizer:
void TMVA::MethodMLP::MinuitMinimize()

and it appears that one just needs to access the synapses directly from the TMultiLayerPerceptron class, as I see in TMVA::MethodMLP::FCN:

   // copy the current Minuit parameters into the synapse weights
   for (Int_t ipar = 0; ipar < fNumberOfWeights; ipar++) {
      TSynapse* synapse = (TSynapse*)fSynapses->At(ipar);
      synapse->SetWeight(fitPars[ipar]);
   }

After setting the weights, I assume I could evaluate the network via TMultiLayerPerceptron::Evaluate(), from which I can get the value of the pdf for a given set of input variables.

So, if this general idea is correct, I need to know the following.

How can I access the synapses of a TMultiLayerPerceptron object? (In TMVA the FCN uses an fSynapses array, but that appears to be a data member of the TMVA class, and I don’t see a public accessor for the synapses in TMultiLayerPerceptron.)

If the synapse weights are being updated inside the Minuit FCN, can I then simply call TMultiLayerPerceptron::Evaluate() to get the pdf value? In TMVA, the FCN calls a method named CalculateEstimator, but that estimator seems to be specific to TMVA.

Any guidance is appreciated,
Ryan Mackenzie White