Gradient Boosted Trees: extract feature importance?
MariusHelf
Hi all,
Is it possible to extract the feature importance from the Gradient Boosted Trees model?
The most convenient option would be a weights output port on the operator in one of the next releases, but I'm sure it must also be possible with some Groovy code?
Unfortunately, the description view of the model only shows the ~10 most/least important features, which is not enough if you have many features.
Cheers,
Marius
Comments
Hi Marius,
Good point, this is in fact something we are considering for one of our upcoming releases. I just raised the priority in our tracking system so hopefully it will make it into a release very soon.
Best, Zoltan
That's good news, thanks! Hope "one of the next" means soon.
Cheers,
Marius
Hi Marius,
This one is already part of the upcoming 7.3 release.
Logistic Regression, Gradient Boosted Trees and Generalized Linear Model all provide an attribute weights vector output.
(Extracting those with Groovy scripts is not possible due to security restrictions, if nothing else.)
Best,
Peter
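For comparison, here is what reading such a weights vector looks like outside RapidMiner, as a minimal scikit-learn sketch (Python, the library, and the demo dataset are illustrative assumptions, not part of RapidMiner):

# A minimal sketch of the same idea outside RapidMiner: scikit-learn's
# gradient boosted trees expose the fitted importance vector directly
# (dataset and library choices here are illustrative assumptions).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier

data = load_breast_cancer()
model = GradientBoostingClassifier(random_state=0).fit(data.data, data.target)

# feature_importances_ holds one weight per attribute; the weights sum to 1
ranked = sorted(zip(data.feature_names, model.feature_importances_),
                key=lambda pair: pair[1], reverse=True)
for name, weight in ranked:
    print(f"{name}: {weight:.4f}")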
@phellinger and @zprekopcsak, perhaps a more generalized operator to extract attribute importance weights from any model would be even more helpful? I know there is not necessarily a single definition of attribute importance inside a multivariate model, but one simple approach is to take the list of all model attributes, remove them one at a time from the final model, and measure the resulting deterioration in model performance, then rank the attributes accordingly (the attribute that leads to the greatest decrease in performance gets the highest weight, and all other attributes' weights are scaled to that; a rough sketch follows below). This can of course be done manually today (even with loops to cut down on repetitive operations), but it would be nice if RapidMiner added an operator that does this automatically for any model and outputs the result as a set of weights. In my view this would answer a very common question from business users about attribute/variable importance in multivariate models.
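A minimal sketch of this drop-one-attribute idea, assuming Python and scikit-learn for brevity (RapidMiner's loop operators would play the same role; the dataset and model choice are illustrative):

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

data = load_breast_cancer()
X, y = data.data, data.target

def cv_accuracy(features):
    # cross-validated accuracy of a GBT model on the given attribute subset
    return cross_val_score(GradientBoostingClassifier(random_state=0),
                           features, y, cv=5).mean()

baseline = cv_accuracy(X)

# Remove each attribute in turn and record the drop in performance
drops = [(name, baseline - cv_accuracy(np.delete(X, i, axis=1)))
         for i, name in enumerate(data.feature_names)]

# Scale so the attribute with the greatest performance decrease gets weight 1
worst = max(drop for _, drop in drops)
for name, drop in sorted(drops, key=lambda pair: pair[1], reverse=True):
    weight = drop / worst if worst > 0 else 0.0
    print(f"{name}: {weight:.3f}")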
Lindon Ventures
Data Science Consulting from Certified RapidMiner Experts
@Telcontar120: sounds like an interesting idea, but it might be misleading: if you have two highly correlated attributes, removing one will not change performance at all, even though one of them may be needed for a good model.
Otherwise, I agree that explaining and adding narrative to a model is very important and we are considering various ways to improve there.
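To illustrate that caveat with a toy example (synthetic data and scikit-learn assumed; every name and number here is made up for demonstration):

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
signal = rng.normal(size=1000)
# Columns 0 and 1 are exact duplicates; column 2 is pure noise
X = np.column_stack([signal, signal, rng.normal(size=1000)])
y = (signal + 0.1 * rng.normal(size=1000) > 0).astype(int)

def cv_accuracy(features):
    return cross_val_score(LogisticRegression(), features, y, cv=5).mean()

baseline = cv_accuracy(X)
for i, name in enumerate(["dup_a", "dup_b", "noise"]):
    drop = baseline - cv_accuracy(np.delete(X, i, axis=1))
    print(f"{name}: performance drop = {drop:.4f}")
# dup_a and dup_b each show a ~zero drop even though together they carry
# all of the signal, so drop-one weights would call both unimportant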