
My honest testing performance is greater than my training performance. Is it luck, or is something incorrect?

Tsar1131 Member Posts: 3 Learner I
edited August 2020 in Help
I have used a split operator on my data source in a 4:1 ratio for training and honest testing, and I am using a Decision Tree (DT) with cross-validation.
The performance result for testing is accuracy 93.21%, kappa 0.863; for training it is accuracy 93.97%, kappa 0.695.
I need to know whether the model is underfitting the data and how I should interpret this result.
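
For context, the setup described above (an 80/20 split, a decision tree, and accuracy/kappa measured on both partitions) can be sketched outside RapidMiner as well. The following Python/scikit-learn snippet is only an illustrative stand-in: the synthetic dataset, the tree parameters, and the random seeds are assumptions, not the original process.

```python
# Minimal sketch of a 4:1 (80/20) "honest testing" setup.
# Assumption: a synthetic dataset and illustrative tree parameters stand in
# for the original RapidMiner data source and Decision Tree operator.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# 4:1 ratio -> 80% training, 20% honest testing
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = DecisionTreeClassifier(max_depth=5, random_state=42).fit(X_train, y_train)

# Compare accuracy and Cohen's kappa on the training and testing partitions
for name, X_part, y_part in [("training", X_train, y_train), ("testing", X_test, y_test)]:
    pred = model.predict(X_part)
    print(f"{name}: accuracy={accuracy_score(y_part, pred):.4f}, "
          f"kappa={cohen_kappa_score(y_part, pred):.4f}")
```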

Answers

  • MartinLiebig Administrator, Moderator, Employee-RapidMiner, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,533 RM Data Scientist
    Hi,
    I would use a cross-validation to check the standard deviation (std_dev) of the performance. Then you can see how lucky you are. (A sketch of this idea follows at the end of the thread.)

    Best,
    Martin
    - Sr. Director Data Solutions, Altair RapidMiner -
    Dortmund, Germany
  • Tsar1131 Member Posts: 3 Learner I
    Hi @mschmitz,
    it says +/- 0.31%.
  • Tsar1131 Member Posts: 3 Learner I
    edited August 2020
    Thanks @mschmitz
    Can you comment on the kappa values my model is producing? I'm curious why kappa is higher for honest testing than for training.
    Thanks again
  • MartinLiebig Administrator, Moderator, Employee-RapidMiner, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,533 RM Data Scientist
    I would do the same trick there :)
    But generally, if testing performance is better than training performance, it is rather unproblematic.
    - Sr. Director Data Solutions, Altair RapidMiner -
    Dortmund, Germany
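
As Martin suggests above, the spread across cross-validation folds is what tells you whether "testing better than training" falls within normal variation. The sketch below is a hedged illustration of that idea in scikit-learn; the dataset, fold count, and tree parameters are assumptions matching the placeholder setup shown in the question, not the original RapidMiner process.

```python
# Sketch of the suggestion above: run a cross-validation and look at the
# mean and standard deviation of the performance, for accuracy and kappa.
# Dataset and tree parameters are assumed placeholders.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_validate
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import make_scorer, cohen_kappa_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
model = DecisionTreeClassifier(max_depth=5, random_state=42)

# Evaluate both metrics across 10 folds
scoring = {"accuracy": "accuracy", "kappa": make_scorer(cohen_kappa_score)}
cv_results = cross_validate(model, X, y, cv=10, scoring=scoring)

for metric in ("accuracy", "kappa"):
    scores = cv_results[f"test_{metric}"]
    print(f"{metric}: mean={scores.mean():.4f} +/- std={scores.std():.4f}")

# If the single-split test result lies within roughly one standard deviation
# of the cross-validated mean, the gap is most likely sampling luck rather
# than a modelling problem.
```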