How to Change Prediction Graphs in Model Simulator
Hi all,
I am using a decision tree model with two datasets (one preprocessed, the other raw) to compare their performance. Both datasets label cases as 1 and controls as 0.
When I run the model with the raw dataset, the Model Simulator's output describes Prediction: 0, e.g. "The outcome is most likely 0, but the model is not very confident. In fact, the confidence for this decision is only 80.00% ..."
When I run the model with the preprocessed dataset, the Model Simulator's output describes Prediction: 1, e.g. "The outcome is most likely 1, but the model is not very confident. In fact, the confidence for this decision is only 54.95%."
Is there a way to fix the Model Simulator's output so it always reports on Prediction 1?
Answers
Have you set the class of highest interest in the second step?
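In case it helps, here is a minimal sketch of the underlying idea in plain Python with scikit-learn (an illustration outside RapidMiner, not the Model Simulator's own code): in a binary problem the two confidences are complementary, so the result can always be phrased in terms of class 1, whichever class the model actually predicts.

from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Toy binary dataset: cases labeled 1, controls labeled 0
X, y = make_classification(n_samples=200, random_state=42)
model = DecisionTreeClassifier(max_depth=3, random_state=42).fit(X, y)

# predict_proba returns one column per class: [P(class 0), P(class 1)]
proba = model.predict_proba(X[:1])[0]

# A simulator-style message reports whichever class is predicted ...
predicted = int(proba.argmax())
print(f"The outcome is most likely {predicted} "
      f"(confidence {proba[predicted]:.2%}).")

# ... but the same result can always be stated for class 1,
# since P(class 1) = 1 - P(class 0) in a binary problem:
print(f"Confidence for class 1: {proba[1]:.2%}")

If I understand the setting correctly, that is what the class of highest interest controls: which class the Model Simulator's explanation is phrased around.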
BR,
Martin
Dortmund, Germany