New Extension: Interpretations - SHAP, LIME and Shapley
Dear Community
I am happy to announce that @pschlunder and I published a new extension to the marketplace: Interpretations!
Until now, RapidMiner users could use Explain Predictions to understand WHY a model predicted the way it did. The Explain Predictions operator uses an algorithm by @IngoRM that focuses on speed, understandability, and applicability across a wide range of use cases and data types.
The new extension adds the well-known LIME, Shapley and SHAP algorithms to the mix. Its operator, Generate Interpretations, has an interface very similar to the familiar Explain Predictions. In fact, it also embeds Explain Predictions, so you can switch between algorithms and get different ‘opinions’ on your predictions.
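If you want a feel for what these algorithms compute, here is a minimal Python sketch using the reference `shap` and `lime` packages. This is only an illustration of the underlying ideas with assumed example data; it is not the extension itself, which runs these algorithms inside RapidMiner:

```python
# Minimal sketch of what LIME and (Kernel)SHAP compute, using the reference
# Python packages 'shap' and 'lime'. Illustration only -- the extension runs
# these algorithms inside RapidMiner.
import shap
from lime.lime_tabular import LimeTabularExplainer
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# KernelSHAP: model-agnostic Shapley value estimates for one test row.
background = shap.sample(X_train, 50)            # small background sample
explainer = shap.KernelExplainer(model.predict_proba, background)
shap_values = explainer.shap_values(X_test[0, :], nsamples=200)

# LIME: weights of a local linear surrogate fitted around the same row.
lime_explainer = LimeTabularExplainer(
    X_train, feature_names=list(data.feature_names),
    class_names=list(data.target_names), discretize_continuous=True)
lime_exp = lime_explainer.explain_instance(
    X_test[0, :], model.predict_proba, num_features=5)
print(lime_exp.as_list())                        # (feature, local weight) pairs
```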
Please be aware that this is a first alpha release of the extension. We are continuously working on improving it and appreciate any feedback!
Thanks!
Philipp & Martin
Dortmund, Germany
Comments
Thank you for the interesting insight and for integrating the 'Generate Interpretations' operator into RapidMiner! This can be very useful!
Is it possible to train and test a Decision Tree with Cross Validation while additionally using KernelSHAP? I implemented the following procedure, but it doesn't work due to the missing connections:
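In Python terms, the workflow I have in mind would be roughly the following. This is just a sketch with scikit-learn, the shap package and assumed example data to illustrate the intent, not my actual RapidMiner process:

```python
# Rough Python analogue of the intended workflow (sketch only, assuming
# scikit-learn and shap): cross-validate a decision tree, then explain the
# fitted model's predictions on held-out rows with KernelSHAP.
import shap
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_hold, y_train, y_hold = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(max_depth=4, random_state=0)

# Estimate performance with 10-fold cross validation ...
scores = cross_val_score(tree, X_train, y_train, cv=10)
print("CV accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))

# ... then fit on the full training portion and explain held-out predictions.
tree.fit(X_train, y_train)
explainer = shap.KernelExplainer(tree.predict_proba, shap.sample(X_train, 50))
shap_values = explainer.shap_values(X_hold[:5], nsamples=100)
```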
Thank you in advance for your help!
Best regards,
Fatih
I recently posted this question. It would be a big help if you could take a look.
Thank you in advance,
Ana
Is it possible to use the Generate Interpretations operator with a model stored in the repository? I can't seem to make it work.
Thanks,
Ana
I tried to use LIME, but I don't understand what the circled values mean. Is there any guide to interpreting them?
Thank you.
Is there a way to connect it to the XGBoost operator in RapidMiner? While e.g. GBM works fine, XGBoost doesn't; it seems to be a type/interface issue. Given the great performance of XGBoost, it would be a pity if we couldn't use this great operator with XGBoost.
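For what it's worth, the reference shap package in Python handles XGBoost models directly through its tree explainer, so the algorithm itself copes with XGBoost fine. This is only a sketch assuming the xgboost and shap packages are installed; it doesn't address the operator's type issue in RapidMiner:

```python
# Sketch only (assuming the Python packages xgboost and shap): the reference
# SHAP implementation supports XGBoost models natively via TreeExplainer.
import shap
import xgboost
from sklearn.datasets import load_breast_cancer

X, y = load_breast_cancer(return_X_y=True)
model = xgboost.XGBClassifier(n_estimators=100).fit(X, y)

explainer = shap.TreeExplainer(model)     # fast, exact tree SHAP
shap_values = explainer.shap_values(X[:5])
```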
Thanks
Erwin