Information gain and numerical attributes
IngoRM
Original message from SourceForge forum at http://sourceforge.net/forum/forum.php?thread_id=2043728&forum_id=390413
Hi,
how does RapidMiner handle numerical attributes in the information gain calculation for feature selection? Is every occurring value used, or does RM calculate several "bins"?
Answer by Ingo Mierswa:
Hello,
do you refer to the InfoGainWeighting operator or the information gain calculation inside of a decision tree learner?
> Is every occurring value used, or does RM calculate several "bins"?
Both are possible. If you discretize the values first with one of the discretization operators, those bins are used. If not, RM tries all possible split points (see the sketch after this message).
Cheers,
Ingo
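
For illustration, here is a minimal Python sketch of what "trying all possible split points" means. This is not RapidMiner's actual implementation, just the standard exhaustive-threshold approach: candidate thresholds lie between consecutive distinct values of the sorted attribute, and the attribute's score is the best information gain any threshold achieves.

# Sketch only, not RapidMiner's code: exhaustive split-point search
# for the information gain of a numerical attribute.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def numeric_info_gain(values, labels):
    """Best information gain over all candidate split points."""
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    base = entropy(labels)
    best = 0.0
    for i in range(1, n):
        # Only split between two distinct attribute values.
        if pairs[i - 1][0] == pairs[i][0]:
            continue
        left = [label for _, label in pairs[:i]]
        right = [label for _, label in pairs[i:]]
        gain = base - (len(left) / n) * entropy(left) \
                    - (len(right) / n) * entropy(right)
        best = max(best, gain)
    return best

# Example: the attribute separates the classes perfectly around 2.5,
# so the best split recovers the full gain of 1 bit.
print(numeric_info_gain([1.0, 2.0, 3.0, 4.0], ["a", "a", "b", "b"]))  # -> 1.0

Pre-discretizing with one of the discretization operators replaces this search with the fixed bins, so only the bin boundaries are evaluated.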
Answer by topic starter:
Hi,
I was referring to the InfoGainWeighting operator, which is used for feature selection.