KNNLearner doesn't use similarity measures
The KNNLearner implementation changed in version 4.2: the call to SimilarityUtil.resolveSimilarityMeasure has been dropped, and now only EuclideanDistance is available as a selectable measure.
How can I use my own similarity measure in a KNNLearner, or any of the measures in the com.rapidminer.operator.similarity package?
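To make the question concrete, this is roughly what I would like to do. It is only a generic sketch, not RapidMiner code; the SimilarityMeasure interface, CosineSimilarity, and SimpleKnn names below are made up for illustration:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical plug-in point: any user-defined measure could implement this.
interface SimilarityMeasure {
    // Larger values mean "more similar"; a distance would be negated.
    double similarity(double[] a, double[] b);
}

// An example custom measure (cosine similarity) that I would like to use
// instead of the hard-wired EuclideanDistance.
class CosineSimilarity implements SimilarityMeasure {
    public double similarity(double[] a, double[] b) {
        double dot = 0.0, normA = 0.0, normB = 0.0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }
}

// Minimal kNN classification parameterized by the measure above.
class SimpleKnn {
    static String classify(final List<double[]> train, final List<String> labels,
                           final double[] query, int k, final SimilarityMeasure m) {
        // Order training examples by descending similarity to the query.
        List<Integer> idx = new ArrayList<Integer>();
        for (int i = 0; i < train.size(); i++) idx.add(i);
        java.util.Collections.sort(idx, new Comparator<Integer>() {
            public int compare(Integer i, Integer j) {
                return Double.compare(m.similarity(train.get(j), query),
                                      m.similarity(train.get(i), query));
            }
        });
        // Majority vote among the k most similar neighbors.
        Map<String, Integer> votes = new HashMap<String, Integer>();
        for (int i = 0; i < k && i < idx.size(); i++) {
            String label = labels.get(idx.get(i));
            Integer c = votes.get(label);
            votes.put(label, c == null ? 1 : c + 1);
        }
        String best = null;
        int bestCount = -1;
        for (Map.Entry<String, Integer> e : votes.entrySet()) {
            if (e.getValue() > bestCount) {
                best = e.getKey();
                bestCount = e.getValue();
            }
        }
        return best;
    }
}
```

With something like this, swapping the measure is just a matter of passing a different implementation, which is the kind of flexibility the old SimilarityUtil.resolveSimilarityMeasure call gave the 4.1 KNNLearner.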
Any idea is welcome.
Thanks in advance.
F.J. Cuberos
Answers
Yeah, we know. We completely re-implemented NaiveBayes and KNN since both were ridiculously slow on larger data sets. During this re-implementation, we unfortunately had to remove the ID-based similarity measures from Michael Wurst (which were one of the main reasons why the KNN learner was slow) and started a new similarity measure hierarchy. Adding the other similarity measures again, and also improving the clustering schemes by letting them use the new similarity measures, is at the top of our todo list and will certainly be finished by the next release.
So for now, there are only two workarounds:
- stick to RM 4.1 until version 4.3 is published, which will then contain all similarity measures, or
- add the KNN implementation from version 4.1 to version 4.2 yourself (both the code and the entry in operators.xml) and recompile RapidMiner.
Sorry for the inconvenience this has caused, but in the end we will have a much more efficient system (just as an aside: the improvement for the KNN learners was a factor of 13 on average).
Cheers,
Ingo
I've just added the 4.1 version to my plugin and I'll wait for the 4.3 version.
Thanks again, Ingo.
Regards.
F.J. Cuberos