Option NCuts=-1 for optimal cuts in BDT training not implemented

Dear experts,

I am using ROOT version 6.10/09, and when trying to train a BDT with the option "nCuts=-1" I get the following warning messages:

WARNING : You had chosen the training mode using optimal cuts, not
WARNING : based on a grid of 200 by setting the option NCuts < 0
WARNING : as this doesn't exist yet, I set it to 200 and use the grid
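
For reference, this is roughly how I book the method (a minimal sketch with toy data; the file, variable, and other option values are just placeholders, the relevant part is nCuts=-1 in the BDT option string):

```cpp
// Sketch of a booking that produces the warning above.
// Toy data and placeholder option values; only nCuts=-1 matters here.
#include "TFile.h"
#include "TTree.h"
#include "TRandom3.h"
#include "TMVA/Factory.h"
#include "TMVA/DataLoader.h"
#include "TMVA/Types.h"

void book_bdt()
{
   TFile *outFile = TFile::Open("TMVA_BDT.root", "RECREATE");

   // Toy signal/background trees with two variables
   TTree *sig = new TTree("sig", "sig");
   TTree *bkg = new TTree("bkg", "bkg");
   Float_t x, y;
   sig->Branch("x", &x); sig->Branch("y", &y);
   bkg->Branch("x", &x); bkg->Branch("y", &y);
   TRandom3 rnd(1);
   for (int i = 0; i < 1000; ++i) {
      x = rnd.Gaus( 1, 1); y = rnd.Gaus( 1, 1); sig->Fill();
      x = rnd.Gaus(-1, 1); y = rnd.Gaus(-1, 1); bkg->Fill();
   }

   TMVA::Factory factory("TMVAClassification", outFile, "!V:AnalysisType=Classification");
   TMVA::DataLoader loader("dataset");
   loader.AddVariable("x", 'F');
   loader.AddVariable("y", 'F');
   loader.AddSignalTree(sig, 1.0);
   loader.AddBackgroundTree(bkg, 1.0);
   loader.PrepareTrainingAndTestTree("", "SplitMode=Random:NormMode=NumEvents:!V");

   // Gradient-boosted BDT, asking for the optimal cut search (nCuts=-1)
   // instead of a fixed grid of cut values
   factory.BookMethod(&loader, TMVA::Types::kBDT, "BDTG",
                      "NTrees=200:BoostType=Grad:Shrinkage=0.10:MaxDepth=3:nCuts=-1");

   factory.TrainAllMethods();
   outFile->Close();
}
```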

Is it true that this method is not implemented yet in the ROOT version I am using, or is something else likely going wrong? Thanks for the help.

regards,

Willem

Hi Willem!

Currently, NCuts < 0 (i.e. a full cut search) is unavailable for regression and for classification with gradient boosting. If you are using classification with another boosting technique, you should find that the parameter behaves as you expect.
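
For example (a rough sketch reusing a factory and dataloader set up as in your post; the option values are placeholders), booking the same nCuts=-1 request with a non-gradient boost type should go through, while BoostType=Grad falls back to the grid:

```cpp
// With BoostType=AdaBoost the full cut search should be used as requested;
// with BoostType=Grad the warning appears and TMVA reverts to the grid.
factory.BookMethod(&loader, TMVA::Types::kBDT, "BDT_Ada",
                   "NTrees=400:BoostType=AdaBoost:AdaBoostBeta=0.5:MaxDepth=3:nCuts=-1");

factory.BookMethod(&loader, TMVA::Types::kBDT, "BDT_Grad",
                   "NTrees=400:BoostType=Grad:Shrinkage=0.10:MaxDepth=3:nCuts=-1");
```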

As for why this is not supported by gradient boosting (which is what I assume you are using), I cannot say without digging deeper into the issue.

Cheers,
Kim

Hi Kim,

Thanks for the quick reply. I am indeed using gradient boosting.

regards,

Willem