"Leave One Out Cross Validation on SVM"
Hi all,
Applying leave-one-out cross validation is expensive, but there is a very efficient way of applying it to an SVM (it only needs to be applied to the support vectors). Is it possible to use this feature in RapidMiner?
best regards
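For context, the shortcut the question refers to rests on a known property of soft-margin SVMs: removing an example that is not a support vector leaves the trained solution unchanged, so its leave-one-out prediction equals the full model's prediction, and only the support vectors need an actual refit. A minimal sketch of this idea, using scikit-learn as a stand-in (the dataset and parameters are illustrative, not from the thread):

```python
# Support-vector shortcut for leave-one-out cross validation (LOOCV).
# Illustrative sketch with scikit-learn; RapidMiner is not used here.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=60, n_features=4, random_state=0)

# Train once on the full data to identify the support vectors.
full_model = SVC(kernel="linear", C=1.0).fit(X, y)
sv_idx = set(full_model.support_)

errors = 0
for i in range(len(X)):
    if i not in sv_idx:
        # A non-support-vector has zero dual coefficient, so removing it
        # does not change the solution: reuse the full model's prediction.
        pred = full_model.predict(X[i:i + 1])[0]
    else:
        # Only support vectors require an actual leave-one-out refit.
        mask = np.arange(len(X)) != i
        model = SVC(kernel="linear", C=1.0).fit(X[mask], y[mask])
        pred = model.predict(X[i:i + 1])[0]
    errors += int(pred != y[i])

loo_error = errors / len(X)
print(f"LOOCV error with {len(sv_idx)} refits instead of {len(X)}: {loo_error:.3f}")
```

Instead of n refits, this needs only as many refits as there are support vectors, which also yields the classic bound that the LOO error is at most #SV/n.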
Answers
Please explain how this could possibly work: the support vectors are determined from the set of examples during the training phase, each time the model is trained. In LOOCV that happens exactly n times if we have n examples. How could you leave out examples if you don't even have a model to determine whether they are support vectors or not?
Greetings,
 Sebastian