Gradient boosting weights
k_vishnu772
Member Posts: 34 Learner III
Hi all,
I ran a model with the gradient boosting algorithm in RapidMiner and looked at the weights generated for each input attribute. Some of them have a weight of zero. Does that mean those attributes are eliminated from the model? In other words, does it effectively perform feature selection, keeping only the attributes with positive weight?
Could you please help me with this?
Regards
Vishnu
Best Answer
MartinLiebig Administrator, Moderator, Employee-RapidMiner, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,533 RM Data Scientist
Hi,
yes and yes; I would call it feature selection. I often use this (or the weights of a Random Forest) for feature selection. Just use a Select by Weights operator afterwards.
Best,
Martin
- Sr. Director Data Solutions, Altair RapidMiner -
Dortmund, Germany
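For readers outside RapidMiner, the same "train, read the weights, select by weights" workflow can be sketched in Python. This is a rough analogue, not RapidMiner's implementation; it assumes scikit-learn's `GradientBoostingClassifier` and synthetic data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic data: only 3 of the 8 attributes are actually informative.
X, y = make_classification(n_samples=300, n_features=8, n_informative=3,
                           n_redundant=0, random_state=0)

model = GradientBoostingClassifier(n_estimators=50, random_state=0)
model.fit(X, y)

weights = model.feature_importances_    # one weight per attribute, sums to 1
selected = np.flatnonzero(weights > 0)  # analogue of "Select by Weights":
X_selected = X[:, selected]             # keep only positive-weight attributes

print("weights:", np.round(weights, 3))
print("kept attributes:", selected)
```

Any downstream model would then be trained on `X_selected` instead of `X`.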
Answers
Hi @k_vishnu772,
the weights are calculated after training. Basically, you run over all trees, calculate the influence of each split, and sum these contributions per attribute. A weight of 0 indicates that the attribute was never used for any split.
Cheers!
Martin
Dortmund, Germany
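Martin's description (sum the influence of every split over all trees) can be sketched as follows. This is a hedged illustration using scikit-learn's tree internals as a stand-in for RapidMiner's implementation; it walks every node of every tree, adds up the impurity decrease per attribute, and normalises:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=300, n_features=6, n_informative=3,
                           n_redundant=0, random_state=1)
model = GradientBoostingClassifier(n_estimators=30, random_state=1).fit(X, y)

totals = np.zeros(X.shape[1])
for stage in model.estimators_:          # one boosting stage per iteration
    for est in stage:                    # the regression tree(s) of that stage
        t = est.tree_
        for node in range(t.node_count):
            f = t.feature[node]
            if f < 0:                    # leaf node: no split here
                continue
            left = t.children_left[node]
            right = t.children_right[node]
            # influence of this split: size-weighted impurity decrease
            gain = (t.weighted_n_node_samples[node] * t.impurity[node]
                    - t.weighted_n_node_samples[left] * t.impurity[left]
                    - t.weighted_n_node_samples[right] * t.impurity[right])
            totals[f] += gain            # sum per attribute, over all trees

manual = totals / totals.sum()           # normalise so the weights sum to 1
print("matches library weights:", np.allclose(manual, model.feature_importances_))
```

An attribute that never appears in any split accumulates nothing in `totals`, which is exactly the zero weight Vishnu observed.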
@mschmitz
so an attribute with weight zero means that even if I remove it, the model should give the same results, right?
And it is a kind of feature selection?
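That conclusion can be checked directly: since a zero-weight attribute is never used in any split, changing its values cannot change the model's predictions. A minimal sketch, again assuming scikit-learn rather than RapidMiner:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=200, n_features=10, n_informative=2,
                           n_redundant=0, random_state=2)
model = GradientBoostingClassifier(n_estimators=20, max_depth=2,
                                   random_state=2).fit(X, y)

# Attributes the trained model assigned zero weight to.
zero_weight = np.flatnonzero(model.feature_importances_ == 0)

# Overwrite only the zero-weight columns with random noise.
X_mod = X.copy()
rng = np.random.default_rng(0)
X_mod[:, zero_weight] = rng.normal(size=(len(X), len(zero_weight)))

# The trees never split on those columns, so predictions are unchanged.
same = np.array_equal(model.predict(X), model.predict(X_mod))
print("zero-weight attributes:", zero_weight, "| predictions identical:", same)
```

So yes: dropping (or scrambling) the zero-weight attributes leaves the fitted model's output untouched, which is why using the weights with Select by Weights behaves like feature selection.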