
How to interpret these different performances?

Fred12 Member Posts: 344 Unicorn
edited June 2019 in Help

hi,

I was trying different operator settings for Boosting and Bagging with WJ48 and Random Forests...

I basically used an Optimize Parameters grid search; inside it an X-Validation; inside that a MetaCost operator with an AdaBoost or Bagging operator inside, which in turn wraps a WJ48 or Random Forest learner...
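For anyone who prefers code, a rough scikit-learn analogue of this nesting might look like the sketch below. It is only a sketch: the class labels and cost weights are invented, and class_weight merely approximates what MetaCost does with a cost matrix.

```python
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

# class_weight stands in for MetaCost's cost matrix
# (the labels 1/3/4 and their weights are invented for illustration)
base_tree = DecisionTreeClassifier(class_weight={1: 1.0, 3: 2.0, 4: 2.0})
boosted = AdaBoostClassifier(estimator=base_tree)  # a BaggingClassifier would be the Bagging variant

# grid search over the boosting parameters, scored by cross-validation
param_grid = {"n_estimators": [25, 50, 100], "learning_rate": [0.5, 1.0]}
search = GridSearchCV(boosted, param_grid, cv=10, scoring="accuracy")
# search.fit(X_train, y_train)  # X_train/y_train: the 70% training part
```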

 

Now I get different performances. I use 70% for training and 30% for testing:

For AdaBoost with MetaCost and the WJ48 decision tree I get:

[Attachment: Unbenannt.PNG]

Bagging with MetaCost and WJ48:

[Attachment: Unbenannt2.PNG]

Bagging with MetaCost and Random Forest:

[Attachment: Unbenannt3.PNG]

 

Now, which one of them is most representative? Should I use 70/30 for cross-validation, or something like 50/50?

In the last one I get 83.7% accuracy; however, class 4 recall is only 60%. Does that mean I should focus more on that class (and that this result is therefore not optimal)?

Whereas in the first example, recall is about 75% for classes 4 and 3 and above 90% for class 1, and precision is above 80% everywhere except pred. 4, which is only around 78%;

but in the last performance, precision is around 85.6% for pred. 4 and 86.7% for pred. 3...
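To compare such per-class numbers side by side, something like scikit-learn's classification_report helps. The labels below are toy values standing in for the 30% test split:

```python
from sklearn.metrics import classification_report, confusion_matrix

# hypothetical labels standing in for the test split (classes 1, 3, 4)
y_true = [1, 1, 1, 3, 3, 3, 4, 4, 4, 4]
y_pred = [1, 1, 1, 3, 3, 4, 4, 4, 3, 3]

print(confusion_matrix(y_true, y_pred))
print(classification_report(y_true, y_pred, digits=3))
# the per-class precision/recall columns make it obvious when one class
# (e.g. class 4 at 60% recall) drags behind an otherwise high accuracy
```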


2nd question: is MetaCost with Boosting even necessary? As I understood it, there is already an implicit weighting that weights falsely classified examples more than others...
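For intuition, the core of that implicit weighting looks roughly like the sketch below. This is a schematic of the classic AdaBoost update, not RapidMiner's actual implementation; note it is error-driven, while MetaCost is cost-driven (it relabels examples using a cost matrix), so the two are not redundant.

```python
import numpy as np

def reweight(w, y_true, y_pred, alpha):
    """One AdaBoost round: upweight the examples the learner got wrong.

    w, y_true, y_pred are numpy arrays; alpha is the round's learner weight.
    """
    w = w * np.exp(alpha * (y_true != y_pred))  # boost the mistakes
    return w / w.sum()                          # renormalize to sum to 1
```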

Last question: can I put more than one classifier into AdaBoost or Bagging (e.g. a Decision Tree plus Naive Bayes or an SVM)?
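For reference, combining heterogeneous learners is usually done with a voting ensemble rather than inside AdaBoost/Bagging, which wrap a single base learner; I believe RapidMiner's Vote operator plays a similar role. A scikit-learn sketch (the model choices are arbitrary):

```python
from sklearn.ensemble import VotingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# a voting ensemble can mix heterogeneous models (tree + NB + SVM);
# probability=True lets the SVC take part in soft (probability) voting
vote = VotingClassifier(
    estimators=[("tree", DecisionTreeClassifier()),
                ("nb", GaussianNB()),
                ("svm", SVC(probability=True))],
    voting="soft",
)
# vote.fit(X_train, y_train)  # X_train/y_train: placeholders for your data
```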


Answers

  • Fred12 Member Posts: 344 Unicorn

    Edit: I noticed that when I created the post, the last third of it was cut off... it didn't appear in the thread. Is this a bug? It has already happened to me twice when I have more than 2 pictures in a post...

  • MartinLiebig Administrator, Moderator, Employee-RapidMiner, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,533 RM Data Scientist

    Fred,

     

    I would say they are all the same. Go for a cross-validation and have a look at the std_dev. Then you will probably see that they are comparable in their variances.
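    For illustration, a minimal cross-validation sketch in scikit-learn (iris stands in for the actual data, and the tree is an arbitrary model):

    ```python
    from sklearn.datasets import load_iris
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)  # stand-in data; use your own ExampleSet
    scores = cross_val_score(DecisionTreeClassifier(), X, y, cv=10)

    # mean +/- std is the "accuracy: 84.2% (+/- 3.1%)"-style number
    # that X-Validation reports
    print(f"accuracy: {scores.mean():.3f} (+/- {scores.std():.3f})")
    ```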

    Are you sure you want to focus on accuracy?

     

    For the potential bug: please consult @stevefarr; he can help you.

     

    ~Martin

    - Sr. Director Data Solutions, Altair RapidMiner -
    Dortmund, Germany
  • bhupendra_patil Employee-RapidMiner, Member Posts: 168 RM Data Scientist

    This may be lame advice, but make sure to use the same "local random seed" for all the validations. If you leave it at the default, you may be training and testing against totally different combinations. It would not matter significantly in many cases, but there is always that one edge case.
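    In scikit-learn terms, the equivalent is fixing the fold generator's seed so every candidate model is scored on the same folds (the seed value below is arbitrary):

    ```python
    from sklearn.model_selection import StratifiedKFold, cross_val_score

    # same random_state => same folds for every model, so performance
    # differences reflect the models rather than the split
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=1992)
    # scores_a = cross_val_score(model_a, X, y, cv=cv)
    # scores_b = cross_val_score(model_b, X, y, cv=cv)
    ```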

  • stevefarr Member Posts: 93 Maven

    Hi Fred

    Sorry you are having this issue. I have not come across it before.

     

    I just successfully posted a message with 4 images in it. The system settings allow users to add up to 1,000 images each, so I guess something else is causing the issue.

     

    1) The permitted image file types are: *.jpg; *.JPG; *.jpeg; *.JPEG; *.gif; *.GIF; *.png; *.PNG; *.pptx

    2) The maximum file size is 10,000 kB.

     

    Could either of these cause the issue?

     

    If not, could you point me to the post that isn't working and send the images you want in the post to community@rapidminer.com? I can then try to replicate the post to eliminate the possibility that my rights on the system differ from yours for attaching images.

     

    Thanks

     

    Steve Farr

  • Fred12 Member Posts: 344 Unicorn

    @stevefarr hm, that's weird. I used 3 PNG images of about 10 kB each... when I first posted, the lower part of my text with the last image was missing...

    I used the "back" button on my browser, copied that part out again,  edited my post and pasted it in again, happened to me 2nd time, I cannot explain why...

     

    Maybe it could also be a display error in my browser; at least I couldn't see the last part of my post...

  • Fred12 Member Posts: 344 Unicorn

    Ok, thanks @mschmitz. Why do you mention "are you sure you want to focus on accuracy"? Should I use another performance metric? Can you clarify that?

  • Thomas_Ott RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 1,761 Unicorn

    @mschmitz raises a good point; you shouldn't solely focus on accuracy, IMHO. I certainly look at it and the std dev when comparing different performance vectors, but in cases of binomial classification I also look at AUC and even Kappa. I believe Martin has a link to a research paper discussing AUC under the PR curve(?), which is another interesting measure in model evaluation.
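    For the binomial case, all three of these are one-liners in scikit-learn (the labels and scores below are toy values):

    ```python
    from sklearn.metrics import (average_precision_score, cohen_kappa_score,
                                 roc_auc_score)

    # toy binomial example: y_score holds positive-class confidences
    y_true  = [0, 0, 0, 1, 1, 1]
    y_pred  = [0, 0, 1, 1, 1, 0]
    y_score = [0.2, 0.3, 0.6, 0.8, 0.9, 0.4]

    print("kappa :", cohen_kappa_score(y_true, y_pred))  # chance-corrected agreement
    print("AUC   :", roc_auc_score(y_true, y_score))     # ranking quality
    print("PR-AUC:", average_precision_score(y_true, y_score))  # often better under imbalance
    ```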

  • Fred12 Member Posts: 344 Unicorn

    @Thomas_Ott

    Ok, but where can I see the std dev in the Performance operator? There is no field for that...

    And I can't use ROC or AUC because it's 3 classes; I would have to use a multinomial-to-binomial classification conversion or something similar...
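    A one-vs-rest average is essentially what that conversion computes. A minimal scikit-learn sketch, with iris as stand-in 3-class data and Naive Bayes as an arbitrary model:

    ```python
    from sklearn.datasets import load_iris
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB

    # one AUC per class from the predicted probabilities, macro-averaged
    X_tr, X_te, y_tr, y_te = train_test_split(*load_iris(return_X_y=True),
                                              random_state=0)
    proba = GaussianNB().fit(X_tr, y_tr).predict_proba(X_te)  # shape (n, 3)
    print(roc_auc_score(y_te, proba, multi_class="ovr", average="macro"))
    ```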

  • Thomas_Ott RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 1,761 Unicorn

    You will get a std dev for your accuracy if you use X-Validation. The PNGs you posted show me that you're not using X-Val at all. You will see something like: Accuracy 70% (+/- 5%)

     

    [Attachment: XVAL STD DEV.png]

  • Fred12 Member Posts: 344 Unicorn

    Hmm... that's really weird, because I was using the X-Validation operator all the time...

     

    In this screenshot it's there... no idea why it's not in the others?!

    [Attachment: Unbenannt.PNG]
