
AutoModel Performance not Matching Confusion Matrix

DennisBaloglu Member Posts: 11 Learner I
Hello,
For the AutoModel, I look at the output for the confusion matrix and performance measures. However, when calculating the performance measures by hand using the confusion matrix, it doesn't match up to the performance measures listed. Am I overlooking something?

Best Answer

  • lionelderkrikor RapidMiner Certified Analyst, Member Posts: 1,195 Unicorn
    Solution Accepted
    Hi @DennisBaloglu

    Yes, this slight difference is expected:

    The displayed performance is, by default, estimated with a "multi hold-out set validation" on the 40% of the dataset that is not used to train the model.
    This "test set" is divided into 7 parts, and 7 performances are calculated.
    AutoModel then removes the maximum and minimum performances (the outliers) and averages the 5 remaining performances (a sketch of this averaging appears after this answer).
    Thus this calculated performance can differ slightly from the performance you compute by hand from the confusion matrix.

    To review the performance calculation methodology, on the results panel (the final screen) you can click on the "information mark" and go to "Models" -> "Performance" to see its description.

    Hope this helps,

    Regards,

    Lionel
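The trimmed averaging described in the accepted answer can be illustrated with a short sketch. The following is a hypothetical Python example, not RapidMiner's actual code: the function name multi_holdout_performance, the scikit-learn accuracy metric, and the array-based hold-out set are assumptions made for illustration. It splits a 40% hold-out set into 7 parts, scores each part, drops the best and worst scores, and averages the remaining 5.

```python
# Hypothetical sketch of the averaging described above -- not RapidMiner's
# actual implementation. Assumes a fitted scikit-learn-style model and a
# hold-out set (the 40% not used for training) as NumPy arrays.
import numpy as np
from sklearn.metrics import accuracy_score

def multi_holdout_performance(model, X_holdout, y_holdout, n_subsets=7):
    """Split the hold-out set into n_subsets parts, score each part,
    drop the best and worst scores, and average the rest."""
    # Split the hold-out data into roughly equal parts.
    index_chunks = np.array_split(np.arange(len(y_holdout)), n_subsets)
    scores = []
    for idx in index_chunks:
        preds = model.predict(X_holdout[idx])
        scores.append(accuracy_score(y_holdout[idx], preds))
    scores = sorted(scores)
    # Discard the minimum and maximum, average the remaining scores.
    trimmed = scores[1:-1]
    return float(np.mean(trimmed))
```

Because the confusion matrix pools every hold-out prediction into a single table, metrics computed directly from it will generally not match this trimmed average exactly, which is the mismatch observed in the question.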
