LibSVM
Using different kernel functions (linear, polynomial, RBF, and sigmoid), different sets of support vectors are extracted. Why, and how?
Likewise, when a kernel's parameters are changed, a different set of support vectors results. Why, and how?
Could anyone please explain this to me? I would be thankful.
Thank you very much.
Answers
Phew, explaining this would be equivalent to explaining the whole idea of kernel-based learning. Instead of replicating everything that has been said and written at least a hundred times, I will give you some hints:
- let's say the data cannot be separated by a linear hyperplane in its input space, but can be separated by a polynomial surface. Then the linear hyperplane will cause many errors, and every margin-violating point becomes a support vector
- the same applies to other kernel functions and to inappropriate kernel parameters: a different decision boundary leaves different points on or inside the margin, so a different set of support vectors is selected
If you have any difficulties understanding those two hints, I would suggest learning much more about how support vector machines work, e.g. at http://www.kernel-machines.org
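To make the hints above concrete, here is a small sketch (my own example, not from the original post) using scikit-learn's `SVC`, which wraps LibSVM internally. It trains on the same non-linearly-separable "two moons" data with each of the four kernels, and also with two different `gamma` values for the RBF kernel, then compares which training points were kept as support vectors:

```python
# Sketch: same data, different kernels / kernel parameters, different
# support vector sets. scikit-learn's SVC is backed by LibSVM.
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# Non-linearly-separable toy data: a linear hyperplane cannot separate it.
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

support_sets = {}
for kernel in ["linear", "poly", "rbf", "sigmoid"]:
    clf = SVC(kernel=kernel, C=1.0).fit(X, y)
    # clf.support_ holds the indices of the training points that
    # ended up as support vectors for this kernel.
    support_sets[kernel] = set(clf.support_)
    print(f"{kernel:8s} -> {len(support_sets[kernel])} support vectors")

# Changing a kernel parameter (here gamma of the RBF kernel) also changes
# the decision boundary, and therefore which points lie on or inside
# the margin, i.e. which points become support vectors.
for gamma in [0.1, 10.0]:
    clf = SVC(kernel="rbf", gamma=gamma, C=1.0).fit(X, y)
    support_sets[f"rbf_gamma_{gamma}"] = set(clf.support_)
    print(f"rbf gamma={gamma:5.1f} -> {len(clf.support_)} support vectors")
```

Because the data is not linearly separable, the linear kernel keeps many margin-violating points as support vectors, while the nonlinear kernels draw different boundaries and therefore select different (often smaller) sets, which is exactly the effect the question asks about.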
Cheers,
Ingo