Model Simulator - optimize input parameters
Hi to all,
I'm building a model with random forest. To find the best-fitting model I use cross-validation. According to the cross-validation results, my model has 74.35% accuracy and 0.830 AUC. These are good performance results for the model.
I also ran the model simulator. Its output lists the contradicting and supporting parameters but indicates that my results are not confident, showing a confidence level of 55%. I wonder how this confidence level is calculated and what it indicates. How can the AUC be high when the confidence level is this low?
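For context on why these two numbers can diverge: a random forest's confidence for an example is typically the fraction of trees voting for the predicted class, while AUC only measures how well the scores rank positives above negatives. A model can rank well overall even though many individual vote shares sit near 50%. A minimal sketch of that distinction (using scikit-learn as a stand-in, which is an assumption on my part and not part of this RapidMiner thread):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# A noisy binary problem: 30% of labels are flipped, so tree votes stay mixed.
X, y = make_classification(n_samples=2000, n_features=20, flip_y=0.3,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

proba = rf.predict_proba(X_te)[:, 1]   # fraction of trees voting class 1
auc = roc_auc_score(y_te, proba)       # ranking quality across examples
conf = np.maximum(proba, 1 - proba)    # per-example confidence

print(f"AUC: {auc:.3f}  mean per-example confidence: {conf.mean():.2f}")
```

The ranking metric (AUC) aggregates over all examples, while the simulator's confidence is a property of the single input combination you entered, so there is no contradiction in one being high and the other modest.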
Another issue is the optimize option of the model simulator. In the output window there is an Optimize button, which optimizes the input parameters in order to increase the confidence level; the confidence level of my model increased to 90%. When the confidence level increases, some of the contradicting and supporting parameters change. Is there a way to see the newly formed model (produced by the optimization) and its performance indicators?
Best Answer
IngoRM (RM Founder, Posts: 1,751)
Hi,

"Another issue is the optimize option of the model simulator. In the output window there is an Optimize button, which optimizes the input parameters in order to increase the confidence level; the confidence level of my model increased to 90%."

That's actually not really what happened. The optimization merely tries input combinations so that the confidence is maximized (or whatever the target is). It does not change the overall confidence levels of the model; it just looks for the case where the model is most sure one way or the other. Those settings are then used on the left side when you press Finish, and the model's response is shown on the right side, including the new supporting and contradicting factors for the new prediction.

"When the confidence level increases, some of the contradicting and supporting parameters change. Is there a way to see the newly formed model (produced by the optimization) and its performance indicators?"

As I said, the model is still the same. The optimization is done with the same model; only the input values are changed. Please see here for more information: https://docs.rapidminer.com/8.2/studio/operators/scoring/model_simulator.html

"I use Weight by Tree Importance to see the highest-weight attributes (the most relevant or important for the outcome). Unfortunately, the attributes do not match the ones listed in the model simulator. For example, the most relevant attribute found by the model simulator is the 973rd among the attribute weights; and vice versa, the highest-weight attributes are not listed in the model simulator output."
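The point above can be illustrated with a short sketch: the "optimization" is just a search over candidate inputs for the point where the same fitted model is most confident, with no retraining involved. (scikit-learn and a plain random search stand in for RapidMiner's simulator here; this is an illustrative assumption, not the product's exact algorithm.)

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Fit one model once; the "optimization" below never touches it again.
X, y = make_classification(n_samples=500, n_features=5, random_state=1)
rf = RandomForestClassifier(n_estimators=100, random_state=1).fit(X, y)

# Random search over the observed input ranges for the most confident point.
rng = np.random.default_rng(1)
lo, hi = X.min(axis=0), X.max(axis=0)
candidates = rng.uniform(lo, hi, size=(5000, X.shape[1]))

proba = rf.predict_proba(candidates)
conf = proba.max(axis=1)               # confidence at each candidate input
best = candidates[conf.argmax()]       # input combination the model is surest about

print(f"best confidence found: {conf.max():.2f}")
print("inputs at that point:", np.round(best, 2))
```

The 90% is a property of the best input combination found, not a new performance figure for the model, which is why there is no "new model" to inspect.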
That can indeed happen, since the two are completely different approaches. One uses artificially generated data points to calculate the local importance of the different factors; the advantage is that this approach works for all model types. More on this here: https://docs.rapidminer.com/9.0/studio/operators/scoring/explain_predictions.html
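A rough sketch of the "artificially generated data points" idea: perturb one attribute at a time around a single example and measure how much the prediction moves. This local sensitivity can rank attributes very differently from any global measure. (A simplified stand-in for illustration, not RapidMiner's exact Explain Predictions algorithm.)

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=5, random_state=2)
rf = RandomForestClassifier(n_estimators=100, random_state=2).fit(X, y)

x0 = X[0]                                        # the example to explain
base = rf.predict_proba(x0.reshape(1, -1))[0, 1]  # its predicted probability

rng = np.random.default_rng(2)
local_importance = []
for j in range(X.shape[1]):
    samples = np.tile(x0, (200, 1))
    # Vary only feature j, drawing replacement values from the data.
    samples[:, j] = rng.permutation(X[:, j])[:200]
    shift = np.abs(rf.predict_proba(samples)[:, 1] - base).mean()
    local_importance.append(shift)

print("local importance per feature:", np.round(local_importance, 3))
```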
The tree importance, by contrast, takes the specific model into account, but obviously only works for tree-based models.
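Tree importance is a global property of the fitted model itself (in scikit-learn terms, mean impurity decrease across the trees), so its ranking need not match any single prediction's local explanation. A minimal sketch, again using scikit-learn as an illustrative assumption:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=5, random_state=2)
rf = RandomForestClassifier(n_estimators=100, random_state=2).fit(X, y)

# Global, model-wide importance; normalized to sum to 1.
importances = rf.feature_importances_
global_rank = importances.argsort()[::-1]

print("global importance ranking (best first):", global_rank)
```

Comparing this ranking with the per-example sensitivities from an Explain-Predictions-style approach makes the mismatch the poster observed unsurprising: one is averaged over the whole model, the other is tied to one data point.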
Hope this helps,
Ingo
Answers
Regarding your first question:
And I'll let more knowledgeable RapidMiners elaborate on your second question.
Vladimir
http://whatthefraud.wtf
Dortmund, Germany