
Scikit-learn: How to obtain True Positive, True Negative, False Positive and False Negative


For the multi-class case, everything you need can be found from the confusion matrix. For example, if your confusion matrix looks like this:

[figure: example multi-class confusion matrix, rows = actual class, columns = predicted class]

Then what you're looking for can be found, per class, as follows: for class i, the true positives are the diagonal entry C[i, i]; the false negatives are the rest of row i (actually class i, predicted as something else); the false positives are the rest of column i (predicted as class i, actually something else); and the true negatives are all the remaining cells.

[figure: per-class TP/FN/FP/TN regions overlaid on the confusion matrix]
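As a concrete sketch for a single class (the matrix entries below are made up for illustration):

import numpy as np

# Hypothetical 3-class confusion matrix; rows = actual, columns = predicted
cm = np.array([[13,  0,  0],
               [ 0, 10,  6],
               [ 0,  0,  9]])

i = 1                                  # look at class 1
TP = cm[i, i]                          # 10: predicted class 1, actually class 1
FN = cm[i, :].sum() - cm[i, i]         # 6: actually class 1, predicted otherwise
FP = cm[:, i].sum() - cm[i, i]         # 0: predicted class 1, actually otherwise
TN = cm.sum() - TP - FN - FP           # 22: everything not involving class 1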

Using numpy, you can do this for all classes at once like so:

import numpy as np

# cm is the confusion matrix as a NumPy array
# (e.g. the output of sklearn.metrics.confusion_matrix)
FP = cm.sum(axis=0) - np.diag(cm)
FN = cm.sum(axis=1) - np.diag(cm)
TP = np.diag(cm)
TN = cm.sum() - (FP + FN + TP)

# Sensitivity, hit rate, recall, or true positive rate
TPR = TP / (TP + FN)
# Specificity or true negative rate
TNR = TN / (TN + FP)
# Precision or positive predictive value
PPV = TP / (TP + FP)
# Negative predictive value
NPV = TN / (TN + FN)
# Fall-out or false positive rate
FPR = FP / (FP + TN)
# False negative rate
FNR = FN / (TP + FN)
# False discovery rate
FDR = FP / (TP + FP)
# Overall accuracy, per class
ACC = (TP + TN) / (TP + FP + FN + TN)
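As a quick check, here is the same computation run end to end on a made-up set of labels, using sklearn's confusion_matrix to build cm:

from sklearn.metrics import confusion_matrix
import numpy as np

y_true = [0, 1, 2, 2, 1, 0, 2, 1]   # made-up ground truth
y_pred = [0, 2, 2, 1, 1, 0, 2, 1]   # made-up predictions

cm = confusion_matrix(y_true, y_pred)
FP = cm.sum(axis=0) - np.diag(cm)
FN = cm.sum(axis=1) - np.diag(cm)
TP = np.diag(cm)
TN = cm.sum() - (FP + FN + TP)
print(TP, FP, FN, TN)               # one entry per class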


If you have two lists containing the predicted and actual values, as it appears you do, you can pass them to a function that calculates TP, FP, TN and FN, like this:

def perf_measure(y_actual, y_hat):
    TP = 0
    FP = 0
    TN = 0
    FN = 0
    for i in range(len(y_hat)):
        if y_actual[i] == y_hat[i] == 1:
            TP += 1
        if y_hat[i] == 1 and y_actual[i] != y_hat[i]:
            FP += 1
        if y_actual[i] == y_hat[i] == 0:
            TN += 1
        if y_hat[i] == 0 and y_actual[i] != y_hat[i]:
            FN += 1
    return (TP, FP, TN, FN)
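For example, with a pair of made-up binary label lists:

y_actual = [1, 0, 1, 1, 0, 0, 1, 0]
y_hat    = [1, 1, 1, 0, 0, 0, 1, 1]

TP, FP, TN, FN = perf_measure(y_actual, y_hat)
print(TP, FP, TN, FN)   # 3 2 2 1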

From here I think you will be able to calculate the rates of interest to you, and other performance measures such as specificity and sensitivity.
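For instance, continuing from the counts returned above (these are the standard definitions, not part of the original answer):

sensitivity = TP / (TP + FN)   # true positive rate, a.k.a. recall
specificity = TN / (TN + FP)   # true negative rate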


According to the scikit-learn documentation,

http://scikit-learn.org/stable/modules/generated/sklearn.metrics.confusion_matrix.html#sklearn.metrics.confusion_matrix

By definition a confusion matrix C is such that C[i, j] is equal to the number of observations known to be in group i but predicted to be in group j.

Thus in binary classification, the count of true negatives is C[0,0], false negatives is C[1,0], true positives is C[1,1] and false positives is C[0,1].

from sklearn.metrics import confusion_matrix

CM = confusion_matrix(y_true, y_pred)

TN = CM[0][0]
FN = CM[1][0]
TP = CM[1][1]
FP = CM[0][1]
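Equivalently, as the scikit-learn documentation shows, in the binary case you can unpack all four counts in one line, since ravel() flattens the matrix in row-major order:

from sklearn.metrics import confusion_matrix

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()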