[Solved] AUC for nominal features
aryan_hosseinza
Member Posts: 74 Contributor II
Hi,
I am working with a decision tree as a binominal classifier and I want to measure the AUC, but I am confused by the result: since all features are nominal, what threshold is being varied?
Thanks,
Arian
Answers
Please post the process you have so far, and describe what exactly confuses you.
Best, Marius
When you want to draw a ROC curve, you vary a threshold, calculate the TPR and FPR for each threshold value, and plot the corresponding curve. But what threshold is being varied here?
You are right about the threshold; however, the threshold is applied to the confidence of belonging to a certain class. It never has anything to do with the type of the input attributes.
So if you have a decision tree, most leaves won't be pure. If such a leaf consists of, let's say, 75% positive examples and 25% negative ones, then a new example which ends up in this leaf gets a confidence of 75% for being positive.
You are using bagging, which means you grow several decision trees and then predict by majority vote. The more of the decision trees that make the same classification, the higher the confidence of the composite bagging model.
That means that for decision trees and for bagging the confidences do not have a continuous range like those of SVMs or Naive Bayes, but only as many "steps" as there are leaves in the tree or classifiers in the bagging model.
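To make this concrete, here is a hypothetical sketch (plain Python, not RapidMiner's actual implementation) of how an AUC can be computed from such discrete confidences: the threshold only sweeps over the few distinct confidence values the tree's leaves produce, which is why the ROC curve has so few corner points.

```python
def roc_auc(confidences, labels):
    """AUC by sweeping a threshold over every distinct confidence value."""
    pos = sum(labels)
    neg = len(labels) - pos
    # Sort thresholds descending so TPR/FPR grow monotonically.
    thresholds = sorted(set(confidences), reverse=True)
    points = [(0.0, 0.0)]  # (FPR, TPR); start above the highest confidence
    for t in thresholds:
        tp = sum(1 for c, y in zip(confidences, labels) if c >= t and y == 1)
        fp = sum(1 for c, y in zip(confidences, labels) if c >= t and y == 0)
        points.append((fp / neg, tp / pos))
    # Trapezoidal area under the resulting piecewise-linear ROC curve.
    auc = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        auc += (x2 - x1) * (y1 + y2) / 2
    return auc

# Two leaves: one 75% positive (confidence 0.75), one 25% positive (0.25).
confs  = [0.75, 0.75, 0.75, 0.75, 0.25, 0.25, 0.25, 0.25]
labels = [1,    1,    1,    0,    1,    0,    0,    0]
print(roc_auc(confs, labels))  # -> 0.75
```

With only two distinct confidences, the ROC curve has just two interior thresholds to sweep, exactly the "steps" described above.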
Best, Marius
Thanks for your answer, but a basic question arises: how does bagging work with decision trees? At each step, what do the decision trees vote on? I got a little confused.
Thanks,
Arian
Here we are talking about model application, obviously. So when bagging classifies a new example, it passes the example to each decision tree and lets them make their decisions (i.e. classifications). Then it collects the classifications and predicts the value which the majority of the trees predicted.
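The voting step above can be sketched in a few lines; this is a hypothetical illustration (the function name and labels are made up, not RapidMiner's API). Each tree contributes one class label, the majority wins, and the confidence is the fraction of agreeing trees:

```python
from collections import Counter

def bagging_predict(tree_predictions):
    """Majority vote over one prediction per tree in the ensemble.

    Returns the winning class label and its confidence (the share
    of trees that voted for it).
    """
    votes = Counter(tree_predictions)
    label, count = votes.most_common(1)[0]
    confidence = count / len(tree_predictions)
    return label, confidence

# Seven trees vote on one new example: five say "yes", two say "no".
label, conf = bagging_predict(["yes", "yes", "no", "yes", "no", "yes", "yes"])
print(label, round(conf, 3))  # -> yes 0.714
```

Note how the confidence can only take values k/7 here, which is again why the ensemble's ROC curve has a stepped shape rather than a smooth one.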
Hope this helps!
~Marius
I just checked and found that bagging doesn't produce one final model! I thought it would. But what about X-Validation? It results in one single final model, although it trains several models on different portions of the input data. How does that work in RapidMiner?
If you connect the model output of the X-Validation, then in addition to the 10 folds it creates one more model on the complete data set. This is just for your convenience and has nothing to do with the performance estimation process.
Thanks