Understanding a confusion matrix
Updated on 02.12.24
Overview
SEON uses confusion matrices to evaluate the accuracy of machine learning rules. You can also use them to test new custom rules and see their effect before enabling them on your account.
Confusion matrices explained
A confusion matrix, also known as an error matrix, is a table layout used to visualize the performance of an algorithm. Confusion matrices let you calculate true positive and true negative rates, rather than relying on simple rule accuracy, which can be misleading.
| | Predicted: No | Predicted: Yes |
| --- | --- | --- |
| Actual: No | True Negative | False Positive |
| Actual: Yes | False Negative | True Positive |
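To make the layout concrete, here is a minimal sketch (not SEON's internal implementation) using scikit-learn, whose confusion_matrix function returns counts in the same Actual-by-Predicted arrangement for binary labels:

```python
from sklearn.metrics import confusion_matrix

# Illustrative labels only: 1 = fraud ("Yes"), 0 = legitimate ("No")
actual    = [0, 0, 1, 1, 1, 0, 1]
predicted = [0, 1, 1, 0, 1, 0, 1]

# Rows are the actual class, columns the predicted class:
# [[TN, FP],
#  [FN, TP]]
print(confusion_matrix(actual, predicted, labels=[0, 1]))
```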
Hands-on with a confusion matrix
Consider the following scenario. SEON's machine learning system identifies a pattern associated with a fraudster and suggests a rule for it. It searches all transactions in your account within the specified timeframe and identifies 165 that match the suspicious pattern (n = 165). However, based on the statuses of these historical transactions, only 105 are actually fraudulent, while the other 60 are legitimate.
Running the rule in the Rule Tester flags 110 transactions as fraudulent. Of these, 10 belong to legitimate users (false positives), while 100 are genuine fraudsters (true positives). The rule misses only 5 fraud attempts (false negatives) and correctly passes the remaining 50 legitimate transactions (true negatives).
| n = 165 | Predicted: No | Predicted: Yes |
| --- | --- | --- |
| Actual: No | 50 | 10 |
| Actual: Yes | 5 | 100 |
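As a quick sanity check, the sketch below (illustrative only, with hypothetical variable names) stores these counts in the same Actual-by-Predicted layout and verifies that they add up to the figures in the scenario:

```python
# Counts from the scenario above (rows = actual, columns = predicted)
tn, fp = 50, 10   # legitimate transactions: correctly passed vs. wrongly flagged
fn, tp = 5, 100   # fraudulent transactions: missed vs. correctly flagged

matrix = [[tn, fp],
          [fn, tp]]

assert tn + fp + fn + tp == 165   # all matched transactions accounted for
assert tp + fn == 105             # actually fraudulent
assert tn + fp == 60              # actually legitimate
assert tp + fp == 110             # transactions the rule flags
```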
Based on this, you can calculate the accuracy and the misclassification rate:
- Accuracy: Overall, how often is the rule correct?
  (TP + TN) / total = (100 + 50) / 165 ≈ 0.91
- Misclassification rate: Overall, how often is the rule wrong?
  (FP + FN) / total = (10 + 5) / 165 ≈ 0.09
  This equals 1 − Accuracy and is also known as the error rate.
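The same figures can be reproduced in a few lines; the sketch below simply restates the arithmetic above with hypothetical variable names:

```python
# Cell counts from the worked example above
tp, tn, fp, fn = 100, 50, 10, 5
total = tp + tn + fp + fn                    # 165 matched transactions

accuracy = (tp + tn) / total                 # 150 / 165 ≈ 0.91
misclassification_rate = (fp + fn) / total   # 15 / 165 ≈ 0.09

# The misclassification (error) rate is always 1 - accuracy.
assert abs(misclassification_rate - (1 - accuracy)) < 1e-9

print(f"Accuracy: {accuracy:.2f}")                              # 0.91
print(f"Misclassification rate: {misclassification_rate:.2f}")  # 0.09
```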
Learn more
Dive into the details of where you'll encounter confusion matrices in SEON.