Using the HyperparameterOptimisation

Hi experts,
I am trying to do a hyperparameter optimisation, but unfortunately I don't quite understand the documentation. I have created a DataLoader and then tried to book a method the following way:

    TMVA::HyperParameterOptimisation optim(dataloader);

    TStopwatch tw;

    optim.BookMethod(TMVA::Types::kMLP, "MLP",
                     "H:!V:NeuronType=tanh:VarTransform=N:NCycles=600:HiddenLayers=N+5:TestRate=5");

    optim.SetFitter("Minuit");
    optim.SetNumFolds(5);
    optim.SetVerbose(1);

This runs fine, but does not optimise anything. What I would like is a variable number of cycles, layers (and maybe nodes per layer) and a variable learning rate, so that the performance is evaluated for each combination and I can afterwards choose the best method available. How can I book the method so that the optimisation knows which parameters to tune?
Thanks,
Emil

Hi @Emil_R,

thank you for your question. Maybe @moneta could take a look?

Cheers,
Marta

Hi,
Apologies for the late reply. HyperParameterOptimisation currently works only with the TMVA methods that implement the function TMVA::MethodBase::OptimizeTuningParameters.
At the moment only the BDT and SVM methods implement it, not MLP.
If you would like to have it for MLP, please open a GitHub issue and we might find the time to implement it.
Thank you
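For reference, here is a minimal sketch of how the class can be used with a method that does implement OptimizeTuningParameters (a BDT in this case). It assumes ROOT is available and `dataloader` is an already-configured TMVA::DataLoader* with signal and background trees registered; the booking option string is just an illustrative example.

```cpp
// Sketch only: requires a working ROOT/TMVA installation and an
// already-configured TMVA::DataLoader* (hypothetical setup here).
#include "TMVA/HyperParameterOptimisation.h"
#include "TMVA/Types.h"

void OptimiseBDT(TMVA::DataLoader *dataloader) {
   TMVA::HyperParameterOptimisation optim(dataloader);

   // BDT implements OptimizeTuningParameters, so its tunable
   // parameters (e.g. NTrees, MaxDepth) are scanned automatically;
   // the booking string only provides the starting configuration.
   optim.BookMethod(TMVA::Types::kBDT, "BDT",
                    "!H:!V:NTrees=200:MaxDepth=3:BoostType=AdaBoost");

   optim.SetFitter("Minuit");
   optim.SetNumFolds(5);

   // Run the per-fold optimisation and print the parameter values found.
   optim.Evaluate();
   const TMVA::HyperParameterOptimisationResult &result = optim.GetResults();
   result.Print();
}
```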

Lorenzo