Regarding Kappa Value in Cross Validation
Hello,
Kappa values can be positive or negative depending on the type (level of agreement is positive, level of disagreement is negative). But in cross-validation, how is the final kappa value calculated? Are the negative values converted to positive before the average kappa over all folds is taken, or are they averaged directly?
Thanks,
Varun
Regards,
Varun
https://www.varunmandalapu.com/
Best Answer
MartinLiebig Administrator, Moderator, Employee-RapidMiner, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,533 RM Data Scientist
Hi @varunm1,
this is the code we use:
double pa = accuracy;
double pe = 0.0d;
for (int i = 0; i < counter.length; i++) {
    double row = 0.0d;
    double column = 0.0d;
    for (int j = 0; j < counter[i].length; j++) {
        row += counter[i][j];
        column += counter[j][i];
    }
    // pe += ((row * column) / Math.pow(total, counter.length));
    pe += row * column / (total * total);
}
return (pa - pe) / (1.0d - pe);

Here, counter holds the confusion matrix values.
Does this make sense to you?
BR,
Martin
- Sr. Director Data Solutions, Altair RapidMiner -
Dortmund, Germany

Answers
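For readers who want to try the formula above outside of RapidMiner, here is a minimal self-contained sketch of the same Cohen's kappa computation. The class and method names (KappaSketch, kappa) and the derivation of pa from the matrix diagonal are mine, not RapidMiner's; the core pe/pa logic mirrors the snippet above.

```java
// Hypothetical standalone sketch of Cohen's kappa from a confusion matrix.
// counter[i][j] = number of examples of true class i predicted as class j.
public class KappaSketch {
    public static double kappa(double[][] counter) {
        double total = 0.0d;
        double diagonal = 0.0d;
        for (int i = 0; i < counter.length; i++) {
            for (int j = 0; j < counter[i].length; j++) {
                total += counter[i][j];
                if (i == j) {
                    diagonal += counter[i][j];
                }
            }
        }
        double pa = diagonal / total; // observed agreement (accuracy)
        double pe = 0.0d;             // agreement expected by chance
        for (int i = 0; i < counter.length; i++) {
            double row = 0.0d;
            double column = 0.0d;
            for (int j = 0; j < counter[i].length; j++) {
                row += counter[i][j];
                column += counter[j][i];
            }
            pe += row * column / (total * total);
        }
        return (pa - pe) / (1.0d - pe);
    }

    public static void main(String[] args) {
        // Perfect agreement on a 2x2 matrix gives kappa = 1.0
        System.out.println(kappa(new double[][] {{5, 0}, {0, 5}}));
        // Partial agreement: pa = 0.7, pe = 0.5, kappa = 0.4
        System.out.println(kappa(new double[][] {{4, 1}, {2, 3}}));
    }
}
```

Note that kappa goes negative whenever the observed agreement pa falls below the chance agreement pe, which is why per-fold kappas in cross-validation can be negative.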
Thanks for your response. I see that this (Cohen's) kappa formula combines all confusion matrices into one to get the final kappa value in cross-validation. Is my understanding correct? Generally, we take the arithmetic mean over all individual folds, so I was confused about how it works here.
Thanks,
Varun
No, no. This is basically the formula to compute one fold's kappa. In cross-validation it is then aggregated again.
BR,
Martin
Dortmund, Germany
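To illustrate the aggregation step Martin describes: if the fold-level kappas are combined with a plain arithmetic mean (a common choice; the exact aggregation in a given tool may differ), negative fold values are not converted to positive first. They simply pull the mean down, directly answering the original question. The class and method names below are mine, for illustration only.

```java
// Hypothetical sketch: averaging per-fold kappa values from cross-validation.
// Assumes a simple arithmetic mean; negative fold kappas stay negative.
public class KappaAggregation {
    public static double meanKappa(double[] foldKappas) {
        double sum = 0.0d;
        for (double k : foldKappas) {
            sum += k; // no sign conversion: a negative kappa lowers the mean
        }
        return sum / foldKappas.length;
    }

    public static void main(String[] args) {
        // Three folds, one with negative kappa: mean = (0.5 - 0.1 + 0.2) / 3 = 0.2
        System.out.println(meanKappa(new double[] {0.5, -0.1, 0.2}));
    }
}
```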
Thanks,
Varun