Parallel Classifier Combination in Single Process
kashif_khan
Member Posts: 19 Contributor II
Hi,
I am incorporating a parallel classifier combination in RapidMiner. I want to provide the same vocabulary to all three classifiers in one go. I have a separate test set, and I know that I need to feed the word list output of the training "Process Documents From File" into the testing "Process Documents From File". I created the process in RapidMiner and access it via Java code.
How can I programmatically change the classifier in a process?
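For reference, a minimal sketch of driving such a process from Java with the RapidMiner 5 API might look like the following; the process path, operator name, and parameter key are placeholders, and actually swapping the learner for a different operator would additionally require creating it via OperatorService.createOperator and reconnecting its ports, which is omitted here.

import java.io.File;

import com.rapidminer.Process;
import com.rapidminer.RapidMiner;
import com.rapidminer.operator.Operator;

public class RunProcess {
    public static void main(String[] args) throws Exception {
        // Initialize RapidMiner for use outside the GUI
        RapidMiner.setExecutionMode(RapidMiner.ExecutionMode.COMMAND_LINE);
        RapidMiner.init();

        // Load an existing process file (path is a placeholder)
        Process process = new Process(new File("/path/to/my_process.rmp"));

        // Look up the learner by the name it has in the process tree
        // ("Naive Bayes" is a placeholder)
        Operator learner = process.getOperator("Naive Bayes");

        // Parameters can be changed programmatically before running
        learner.setParameter("laplace_correction", "true");

        // Execute the whole process
        process.run();
    }
}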
Answers
When including RapidMiner in your own software, please keep in mind that RapidMiner is released under the AGPL, which means that your code must also be released under the AGPL.
Best regards,
Marius
I have a problem: I need to combine k-NN and SVM and want to aggregate their outputs. I am waiting for your reply.
keep in touch
Best regards,
Marius
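The thread does not show how the aggregation was solved; purely as an illustration of the idea (plain Java, not the RapidMiner API), one way to aggregate two classifiers' outputs is to average their per-class confidences and pick the class with the highest combined score. The label names and values below are made up.

import java.util.Map;

public class ConfidenceVote {

    // Averages two confidence maps (class label -> confidence) and returns the winning label.
    public static String combine(Map<String, Double> knnConfidences,
                                 Map<String, Double> svmConfidences) {
        String bestLabel = null;
        double bestScore = Double.NEGATIVE_INFINITY;
        for (String label : knnConfidences.keySet()) {
            double knn = knnConfidences.getOrDefault(label, 0.0);
            double svm = svmConfidences.getOrDefault(label, 0.0);
            double combined = (knn + svm) / 2.0;  // simple unweighted average
            if (combined > bestScore) {
                bestScore = combined;
                bestLabel = label;
            }
        }
        return bestLabel;
    }

    public static void main(String[] args) {
        Map<String, Double> knn = Map.of("positive", 0.6, "negative", 0.4);
        Map<String, Double> svm = Map.of("positive", 0.3, "negative", 0.7);
        System.out.println(combine(knn, svm)); // prints "negative"
    }
}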
I have another question.
What is the difference between the Vote operator, Bagging, and Stacking?
And what would you advise?
keep in touch
Voting simply means to train several different learners on the same data and, on application, to let their predictions decide by majority vote.
Bagging means to create several, let's say 10, bootstrapped samples from the same dataset and to train a model with the same algorithm, e.g. a decision tree, on each of them. On application, again a majority vote is performed. The final bagging model is usually used to improve the stability and robustness of otherwise unstable methods like the decision tree. This is especially true on noisy training data.
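To make the bagging idea concrete, here is a generic sketch (plain Java, not the RapidMiner Bagging operator): the learner is passed in as a function, e.g. a decision-tree trainer, each model is trained on a bootstrap sample, and prediction is a majority vote.

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Random;
import java.util.function.BiFunction;
import java.util.function.Function;

public class Bagging<X, Y> {

    private final List<Function<X, Y>> models = new ArrayList<>();
    private final Random random = new Random(42);

    // Trains numModels models, each on a bootstrap sample of the training data.
    public void train(List<X> inputs, List<Y> labels, int numModels,
                      BiFunction<List<X>, List<Y>, Function<X, Y>> learner) {
        int n = inputs.size();
        for (int m = 0; m < numModels; m++) {
            List<X> sampleX = new ArrayList<>();
            List<Y> sampleY = new ArrayList<>();
            for (int i = 0; i < n; i++) {          // draw n points with replacement
                int idx = random.nextInt(n);
                sampleX.add(inputs.get(idx));
                sampleY.add(labels.get(idx));
            }
            models.add(learner.apply(sampleX, sampleY));
        }
    }

    // Predicts by majority vote over all bagged models (train must be called first).
    public Y predict(X input) {
        Map<Y, Integer> votes = new HashMap<>();
        for (Function<X, Y> model : models) {
            votes.merge(model.apply(input), 1, Integer::sum);
        }
        return votes.entrySet().stream()
                .max(Map.Entry.comparingByValue())
                .get().getKey();
    }
}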
Stacking, finally, means to create several models and then train a final model on the predictions of the first group of models.
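And a small sketch of the stacking idea (again plain Java, not the RapidMiner Stacking operator): the base models' predictions become the input features on which the final (meta) model is trained.

import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

public class Stacking {

    // Turns one example into a meta-feature vector: one entry per base-model prediction.
    public static <X> double[] metaFeatures(X example, List<Function<X, Double>> baseModels) {
        double[] features = new double[baseModels.size()];
        for (int i = 0; i < baseModels.size(); i++) {
            features[i] = baseModels.get(i).apply(example);
        }
        return features;
    }

    // Builds the meta-level training set from base-model predictions on the original data.
    public static <X> List<double[]> metaTrainingSet(List<X> examples,
                                                     List<Function<X, Double>> baseModels) {
        List<double[]> metaInputs = new ArrayList<>();
        for (X example : examples) {
            metaInputs.add(metaFeatures(example, baseModels));
        }
        return metaInputs;  // train the final model on these vectors plus the original labels
    }
}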
Best regards,
Marius