Interpreting cross validation
Hi guys!
I'm a total beginner, so please bear with me.
I have a process set up, with a cross validation at the end. Inside it I have Deep Learning, Apply Model and Performance operators. So far so good, so after I run it (4h later) I get a Confusion Matrix and an Accuracy. And here is my question:
So I have accuracy: 35.42% +/- 47.83% (mikro: 35.42%)
Is the accuracy the average accuracy of all models trained?
And is the +/- 47.83% the variance?
And for the confusion matrix, is it from the last model trained, or is it some kind of summary of all the runs?
To be exact I use k-fold cross validation, so maybe my understanding of that process is wrong.
Thx in advance and sorry for noob question!
Answers
Hi,
You are mostly right. The average accuracy over all k applications of your model is 35.42%. The +/- 47.83% is the standard deviation (not the variance) of those k accuracies. The confusion matrix covers all examples: each example is predicted exactly once, while it is in the held-out fold, so the matrix shown is effectively the sum of the k individual confusion matrices.
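To make the aggregation concrete, here is a minimal NumPy sketch with made-up fold results (this is an illustration of the idea, not RapidMiner's internal code; the fold labels and predictions are random placeholders):

```python
import numpy as np

# Hypothetical 10-fold cross validation on a 3-class problem.
rng = np.random.default_rng(0)
k, n_classes, fold_size = 10, 3, 50

fold_accuracies = []
total_confusion = np.zeros((n_classes, n_classes), dtype=int)

for _ in range(k):
    # Placeholder labels/predictions for this fold's held-out test set.
    y_true = rng.integers(0, n_classes, size=fold_size)
    y_pred = rng.integers(0, n_classes, size=fold_size)

    fold_accuracies.append(np.mean(y_true == y_pred))

    # Per-fold confusion matrix: rows = true class, columns = predicted class.
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    total_confusion += cm  # the reported matrix is the sum over all folds

mean_acc = np.mean(fold_accuracies)   # the "accuracy" value
std_acc = np.std(fold_accuracies)     # the "+/-" value (std dev, not variance)
# Micro accuracy: pool all predictions first, then compute one accuracy.
micro_acc = np.trace(total_confusion) / total_confusion.sum()

print(f"accuracy: {mean_acc:.2%} +/- {std_acc:.2%} (mikro: {micro_acc:.2%})")
```

Note that with equally sized folds the micro ("mikro") accuracy equals the average of the per-fold accuracies, which is why both numbers in your output are 35.42%.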
~Martin
Dortmund, Germany