Performance Vectors (and Confusion Matrix)
Hello everyone,
I have a process in which I train a classifier and then test its performance. My example set has a binominal label, and when I output the data generated by the binominal performance evaluator I get a nice view with information about (in this case) accuracy, precision, and recall.
My problem is that the wrong label value is treated as the positive class. Is there an easy way to change that? Or is there an easy way to access the elements of the confusion matrix?
Answers
Just use the Remap Binominals operator to define which value should be treated as positive and which as negative.
Greetings,
Sebastian
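To see why the choice of positive class matters, here is a minimal plain-Python sketch (not RapidMiner's API; the label values "yes"/"no" and the data are made up for illustration). Precision and recall generally change when you swap which label counts as positive:

```python
# Illustrative example data: a binary label with values "yes" and "no".
y_true = ["yes", "yes", "no", "no", "yes"]
y_pred = ["yes", "no", "no", "yes", "yes"]

def precision_recall(y_true, y_pred, positive):
    """Compute precision and recall, treating `positive` as the positive class."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    return tp / (tp + fp), tp / (tp + fn)

p_yes, r_yes = precision_recall(y_true, y_pred, "yes")  # "yes" as positive
p_no, r_no = precision_recall(y_true, y_pred, "no")     # "no" as positive
```

With "yes" as positive, both precision and recall come out to 2/3 here; with "no" as positive, both drop to 1/2. Remapping the binominal values in RapidMiner has exactly this kind of effect on the reported metrics.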
I also noticed that I can read the desired sample counts from the performance vector if I set the right flags in the Performance Evaluator operator beforehand.
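For reference, the four counts behind those metrics are just tallies of (actual, predicted) pairs. A small stdlib-only sketch (again with hypothetical "yes"/"no" labels, "yes" taken as positive, not RapidMiner code) shows how a confusion matrix reduces to these counts:

```python
from collections import Counter

# Illustrative example data with "yes" as the positive class.
y_true = ["yes", "yes", "no", "no", "yes"]
y_pred = ["yes", "no", "no", "yes", "yes"]

# Tally every (actual, predicted) pair once.
counts = Counter(zip(y_true, y_pred))
tp = counts[("yes", "yes")]  # true positives
fn = counts[("yes", "no")]   # false negatives
fp = counts[("no", "yes")]   # false positives
tn = counts[("no", "no")]    # true negatives

print(tp, fn, fp, tn)  # 2 1 1 1
```

These are the same cell values a performance vector's confusion matrix reports, just computed by hand.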