
AdaBoost Decision Stump

MSTRPC Member Posts: 2 Learner I
Hey all,
I have a question about decision stumps in the AdaBoost algorithm, because the literature recommends using a "weak learner".

First I used a decision stump inside the AdaBoost operator with 10 iterations, but the trees all looked identical and my results weren't as expected. Then I saw that the tutorial process for the AdaBoost operator uses a decision tree with a depth of 10. But isn't the advantage of AdaBoost that you use weak learners to get better results through iterative learning?

With the default decision tree the results are good, but I don't understand why a normal decision tree can be used here.
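To make the "weak learner" point concrete, here is a minimal sketch in scikit-learn (not RapidMiner; the toy dataset and parameter values are just my own placeholders) comparing AdaBoost over depth-1 stumps with AdaBoost over depth-10 trees:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Toy data, just for illustration.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# A decision stump (max_depth=1) is the classic "weak learner" for AdaBoost.
# Note: scikit-learn < 1.2 calls this parameter base_estimator, not estimator.
stumps = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),
    n_estimators=10,
    random_state=42,
)

# A depth-10 tree is already a fairly strong learner on its own,
# like the one used in the tutorial process.
deep_trees = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=10),
    n_estimators=10,
    random_state=42,
)

for name, model in [("depth-1 stumps", stumps), ("depth-10 trees", deep_trees)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"AdaBoost with {name}: mean accuracy {scores.mean():.3f}")
```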


After running that process I looked at the precision of the model, and in the results there is a "w" with a value. Is this the sum of the weights per stump? I couldn't find any explanation. Sorry if this question has been asked before; I haven't been using RapidMiner for long.
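About the "w": if it is the weight each stump gets in the final ensemble vote, the textbook discrete AdaBoost update computes it from the weighted training error of that stump. Here is a small sketch of that formula (my assumption; I don't know whether RapidMiner's operator uses exactly this variant, and some implementations drop the 1/2 factor):

```python
import math

def adaboost_model_weight(weighted_error: float) -> float:
    """Vote weight of one weak learner in discrete AdaBoost:
    w = 0.5 * ln((1 - eps) / eps), where eps is the learner's
    weighted training error. More accurate stumps get a bigger vote."""
    eps = min(max(weighted_error, 1e-10), 1.0 - 1e-10)  # guard against log(0)
    return 0.5 * math.log((1.0 - eps) / eps)

print(adaboost_model_weight(0.20))  # stump with 20% error -> ~0.693
print(adaboost_model_weight(0.45))  # near-chance stump    -> ~0.100
```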


Greetings :smile:

MSTRPC

Answers

  • MSTRPC Member Posts: 2 Learner I
    Hello, 

    thank you for the answer, it really helps me solve the problem :)

    Greetings,

    MSTRPC