Classification Metrics Calculator
Calculate precision, recall, F1-score, and other classification metrics
Primary Metrics
- Precision
Measures how many of the predicted positives are actually positive: TP / (TP + FP). High precision means few false positives among the predicted positives.
- Recall (Sensitivity)
Measures how many of the actual positives were correctly identified: TP / (TP + FN). High recall means few missed positives (false negatives).
- F1-Score
Harmonic mean of precision and recall: F1 = 2PR / (P + R). It balances the two metrics and is high only when both are high.
- Accuracy
Proportion of all predictions that were correct: (TP + TN) / (TP + FP + FN + TN). Can be misleading on imbalanced datasets, where always predicting the majority class scores well.
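The primary metrics above can be sketched as a small function over confusion-matrix counts. This is a minimal illustration, not the calculator's actual implementation; the function name and return format are assumptions.

```python
def primary_metrics(tp, fp, fn, tn):
    """Compute precision, recall, F1, and accuracy from confusion-matrix counts."""
    # Guard against division by zero when a class is never predicted or never occurs.
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    # F1 is the harmonic mean of precision and recall.
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return {"precision": precision, "recall": recall, "f1": f1, "accuracy": accuracy}
```

For example, with 8 true positives, 2 false positives, 4 false negatives, and 6 true negatives, `primary_metrics(8, 2, 4, 6)` gives precision 0.8, recall ~0.667, and accuracy 0.7.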
Advanced Metrics
- Specificity
Measures how many of the actual negatives were correctly identified: TN / (TN + FP). Also known as the true negative rate.
- F-Beta Score
Weighted harmonic mean of precision and recall: F_beta = (1 + beta^2) * P * R / (beta^2 * P + R). Beta > 1 weights recall more heavily than precision; beta < 1 weights precision more heavily.
- Matthews Correlation Coefficient (MCC)
A balanced measure that uses all four confusion-matrix counts and works well even with imbalanced classes. Ranges from -1 to 1: +1 indicates perfect prediction, 0 is no better than random, and -1 indicates total disagreement.
- Balanced Accuracy
Average of recall and specificity: (recall + specificity) / 2. Useful for imbalanced datasets because both classes contribute equally.
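The advanced metrics can be sketched the same way. This is an illustrative implementation under the formulas above, not the calculator's own code; the function name, `beta` default, and return format are assumptions.

```python
import math

def advanced_metrics(tp, fp, fn, tn, beta=1.0):
    """Compute specificity, F-beta, MCC, and balanced accuracy from counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    # F-beta: beta > 1 favors recall, beta < 1 favors precision.
    b2 = beta * beta
    f_beta = ((1 + b2) * precision * recall / (b2 * precision + recall)
              if (b2 * precision + recall) else 0.0)
    # Matthews Correlation Coefficient; defined as 0 when any marginal is empty.
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    balanced_accuracy = (recall + specificity) / 2
    return {"specificity": specificity, "f_beta": f_beta,
            "mcc": mcc, "balanced_accuracy": balanced_accuracy}
```

With `beta=1.0`, `f_beta` reduces to the ordinary F1-score, which is a quick sanity check against the primary metrics.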