First of all, thank you for your help!
In the TMVA users guide, I found that TMVA’s SVM implementation supports linear, Gaussian, sigmoidal and polynomial kernel functions, but I do not understand how to use them. Are they implicitly used in every training with the given parameters (C, Tol, MaxIter, NSubSets, Gamma)? Or is there some kind of option like “kernel=Gaussian” or something like that?
I would also like an optimization example. I have tried OptimizeAllMethodsForClassification(“ROCIntegral”,“Scan”), but it is not yet implemented. Do I have to go through the parameter space “manually”, or is there another option already implemented?
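For reference, this is roughly how the SVM is booked in a typical TMVA training macro: the configuration parameters mentioned above all go into the booking option string, and there is no separate kernel switch in this version. The option values here are illustrative, not recommended defaults:

```cpp
// Sketch of booking the SVM in a TMVA training macro (requires ROOT/TMVA).
// All SVM configuration goes into the option string of BookMethod;
// the values shown are placeholders, not tuned settings.
factory->BookMethod( TMVA::Types::kSVM, "SVM",
                     "Gamma=0.25:Tol=0.001:C=1:MaxIter=1000" );
```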
Well, unfortunately none of the ‘optimisation’ of the configuration parameters is implemented yet, except for
the BDT (and even that is only some experimental trial version).
Now, it seems that in the current SVM version only the Gaussian kernel is used (I don’t know why the
authors have removed the linear/polynomial kernels, but in the end the Gaussian is the most useful one).
The description of the other config parameters (C, Tol, MaxIter, NSubSets, etc.) you can find in the users guide, or isn’t that clear enough? (Well, perhaps not VERY clear, but with a bit of imagination one should be able to work it out.) I’d say:
C = the cost parameter, i.e. how strongly misclassified events are punished in the training (that’s the regularisation parameter)
Gamma = the width of the Gaussian kernel (related to sigma via Gamma = 1/(2*sigma^2))
Tol = the tolerance used as the convergence criterion of the numerical fit
MaxIter etc. are parameters related to “how” much effort should be spent in the numerical fitting of the margin …
How does one know when the Gaussian kernel is being used? I mean, is there a switch which controls whether one is using a linear or nonlinear SVM, or does it only use nonlinear (and hence is always using a Gaussian kernel since no other options are available)?