"Problem with Simple Decision Tree Analysis"
Legacy User
Member Posts: 0 Newbie
I just downloaded RapidMiner and am evaluating it for some data mining work we are doing. I went through the tutorials and chose a simple decision tree exercise with some actual usage data that I have (1,300 users, 16 attributes, predicting use of one tool). The DecisionTree operator returns only the Use variable split (100/1200) but no partitions. I have checked that the data is being read correctly, and the parameters all look consistent with the tutorial examples. I must be missing something obvious.
Any help gratefully accepted,
Dr. Richard Y Flanagan
Rohm and Haas Company
Answers
Welcome to our forum! Do I understand you correctly that the decision tree you get consists of only one node, i.e. the root? You can force the decision tree learner to prune less by setting its parameters appropriately. The tooltips displayed when hovering your mouse over a parameter give some guidance. Although there is normally no general rule for setting these parameters (the optimal settings depend crucially on your data), you could try setting the parameter [tt]minimal_leaf_size[/tt] to a smaller value and setting [tt]no_pruning[/tt] to true.
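In case a concrete illustration helps: the snippet below is not RapidMiner code but a small scikit-learn sketch in Python of the same idea, namely that with a rare target class (roughly 100 positives out of 1300 examples), a large minimal leaf size or aggressive pruning can leave you with nothing but the root node, while relaxing those settings lets the tree actually partition the data. All names, values, and the synthetic data in it are illustrative assumptions, not your real dataset or RapidMiner's API.

[code]
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Toy data shaped like the problem described: 1300 examples, 16 attributes,
# and a rare positive ("Use") class. Purely synthetic, for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(1300, 16))
y = (X[:, 0] + rng.normal(scale=2.0, size=1300) > 2.5).astype(int)

# Analogue of a large minimal leaf size: each leaf must hold more than half
# of the data, so no split is possible and the tree stays a single root node.
stump = DecisionTreeClassifier(min_samples_leaf=700).fit(X, y)
print(stump.get_n_leaves())   # 1

# Analogue of a small minimal leaf size with pruning switched off
# (ccp_alpha=0.0 disables cost-complexity pruning in scikit-learn):
tree = DecisionTreeClassifier(min_samples_leaf=2, ccp_alpha=0.0).fit(X, y)
print(tree.get_n_leaves())    # many leaves; the tree actually partitions
[/code]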
Hope this was helpful in some way.
Regards,
Tobias