"Recursive partitioning regression tree?"
I would like to discover explanatory variables for a numeric dependent variable (which conditions result in the shortest commute times on any given weekday). Some of the explanatory variables are numeric, others are categorical. Does RapidMiner have any algorithms that can handle that?
I understand that recursive partitioning regression trees can. Is there an equivalent or alternative in RM?
Answers
RapidMiner's Decision Trees can only be used for classification problems, not for regression tasks. However, they can handle both nominal and numeric explanatory attributes.
If you want to stick with decision trees, you can discretize the label (as we call the dependent variable) with one of the Discretize operators.
However, decision trees are probably not the best way to assess the explanatory power of variables. For that you could, for example, try Linear Regression and look at the p-values and attribute weights of the resulting model, or wrap a Forward Selection around a decision tree.
Best regards,
Marius
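To illustrate the linear-regression approach Marius suggests, here is a minimal sketch outside RapidMiner, in plain NumPy/SciPy. The commute data is synthetic (an assumption, not the poster's actual data); the point is that each coefficient's p-value tells you whether that explanatory variable has significant explanatory power for the numeric target.

```python
# Hedged sketch: fit a linear model and compute per-coefficient p-values,
# analogous to inspecting the p-values of RapidMiner's Linear Regression model.
# The data below is synthetic, invented for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200
departure_hour = rng.uniform(6, 10, n)      # numeric explanatory variable
distance_km = rng.uniform(5, 30, n)         # numeric explanatory variable
# Synthetic target: commute time driven by both predictors plus noise.
commute_min = 2.0 * distance_km + 5.0 * departure_hour + rng.normal(0, 3, n)

# Design matrix with an intercept column, solved by ordinary least squares.
X = np.column_stack([np.ones(n), departure_hour, distance_km])
beta, _, _, _ = np.linalg.lstsq(X, commute_min, rcond=None)

# Standard errors and two-sided t-test p-values for each coefficient.
resid = commute_min - X @ beta
dof = n - X.shape[1]
sigma2 = resid @ resid / dof
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
p_values = 2 * stats.t.sf(np.abs(beta / se), dof)

for name, b, p in zip(["intercept", "departure_hour", "distance_km"],
                      beta, p_values):
    print(f"{name}: coef={b:.2f}, p={p:.4f}")
```

A small p-value for a coefficient indicates the corresponding attribute contributes significantly; a nominal attribute would first need dummy (one-hot) coding to appear as columns in the design matrix.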
If you install the Weka Extension for RapidMiner, you can use Weka's regression tree learners seamlessly within RapidMiner. To install the extension, start RapidMiner, open the Help menu, and choose "Updates and Extensions (Marketplace)".
Cheers,
Ralf