"Lower accuracy of libsvm in rapidminer vs weka with same parameters"
Hi,
I am really getting confused. I used the same data set with the same parameters in Weka and RapidMiner for classification with LibSVM. The strange result was that Weka's prediction accuracy was about 15 percentage points higher than RapidMiner's (I used the same C, the same kernel, and the same gamma for the RBF kernel). I think there is something wrong with Weka: its accuracy on the training set during the testing phase is 100%.
Can anybody explain this?
Thanks a lot.
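Two things are worth checking here. First, 100% accuracy on the training set usually just means the model is being scored on the same data it was fitted on, which says nothing about generalization. Second, a common source of such gaps between tools is different default preprocessing: an RBF-kernel SVM is very sensitive to feature scaling, and the two tools may not scale the data the same way by default. A minimal sketch of how to isolate this, assuming Python with scikit-learn (whose SVC wraps LibSVM); the wine data set is only a stand-in for the actual data, and the C/gamma values are hypothetical placeholders for the ones actually used:

from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in data set; substitute the one used in Weka/RapidMiner.
X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

params = dict(kernel="rbf", C=1.0, gamma=0.01)  # hypothetical values

# Same LibSVM parameters, fitted with and without feature scaling.
raw = SVC(**params).fit(X_train, y_train)
scaled = make_pipeline(StandardScaler(), SVC(**params)).fit(X_train, y_train)

print("unscaled test accuracy:", raw.score(X_test, y_test))
print("scaled test accuracy:  ", scaled.score(X_test, y_test))
# Training-set accuracy is not a fair comparison point:
print("unscaled train accuracy:", raw.score(X_train, y_train))

If a direct LibSVM run with identical parameters is stable while the two tools disagree, the difference is almost certainly in the preprocessing or in how the evaluation is set up, not in LibSVM itself.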
Answers
RapidMiner's confusion matrix is:

725 188 211
  3   1   1
  1   0   1

but Weka with the same parameters gives:

729   0   0
124  65   0
111   0 101
I did not use any class weights in Weka. Please help me and tell me what is wrong with RapidMiner.
Thank you.
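For reference, the gap between the two matrices above can be quantified directly: accuracy is the sum of the diagonal divided by the sum of all entries. A minimal check in Python (NumPy only; the matrices are copied verbatim from the post):

import numpy as np

# Confusion matrices as posted. The diagonal holds the correct
# predictions either way, so trace / total gives the accuracy
# regardless of row/column orientation.
rapidminer = np.array([[725, 188, 211],
                       [  3,   1,   1],
                       [  1,   0,   1]])
weka = np.array([[729,   0,   0],
                 [124,  65,   0],
                 [111,   0, 101]])

for name, m in (("RapidMiner", rapidminer), ("Weka", weka)):
    print(name, np.trace(m) / m.sum())
# RapidMiner ~0.643, Weka ~0.792 -- roughly the 15-point gap described above.

Note also that the RapidMiner matrix assigns almost everything to the first class, which is a typical symptom of an RBF SVM run on unscaled features or with an ill-suited gamma, rather than of class weights.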
Does the Weka LibSVM implementation still exist, or was it removed? Somehow I cannot see any LibSVM Weka implementation in the list anymore...? I would like to try it out.
The version of Weka we implement is v3.6.9. But according to https://weka.wikispaces.com/LibSVM, LibSVM is a third-party tool that needs to be downloaded and installed separately. So if you want to use the Weka implementation of LibSVM, you have to download it yourself.
Thanks, but it seems that the download links are broken...
I can't do anything about that. Maybe contact them somehow?