Classification Metrics Calculator

Calculate precision, recall, F1-score, and other classification metrics

Input Data
Enter your classification results as the four cells of a 2×2 confusion matrix, with predicted Positive/Negative crossed against actual Positive/Negative:

                            Actual Positive    Actual Negative
        Predicted Positive  True Positives     False Positives
        Predicted Negative  False Negatives    True Negatives
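
The four input cells can also be tallied programmatically from paired label lists. A minimal Python sketch, illustrative only and not the calculator's actual code (all names are assumptions):

    def confusion_counts(y_true, y_pred, positive=1):
        """Tally TP, FP, FN, TN from paired actual/predicted labels."""
        tp = fp = fn = tn = 0
        for actual, predicted in zip(y_true, y_pred):
            if predicted == positive:
                if actual == positive:
                    tp += 1  # predicted positive, actually positive
                else:
                    fp += 1  # predicted positive, actually negative
            else:
                if actual == positive:
                    fn += 1  # predicted negative, actually positive
                else:
                    tn += 1  # predicted negative, actually negative
        return tp, fp, fn, tn
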
Calculated Metrics
Performance metrics based on your inputs. The calculator reports Precision, Recall, F1-Score, Accuracy, and Specificity, each as a percentage (shown as 0.00% until results are entered).
Metrics Comparison
Visual comparison of key performance metrics

Precision-Recall Curve
Trade-off between precision and recall at different decision thresholds

ROC Curve
True positive rate plotted against false positive rate as the threshold varies

F-Beta Curve
How the F-score changes with different beta values
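
The Precision-Recall and ROC curves are both traced by sweeping a decision threshold over per-example classifier scores; a minimal sketch, assuming such scores are available (all names here are illustrative):

    def curve_points(y_true, scores):
        """At each candidate threshold, return a (recall, precision, fpr) triple."""
        points = []
        for t in sorted(set(scores), reverse=True):
            pred = [1 if s >= t else 0 for s in scores]
            tp = sum(1 for p, y in zip(pred, y_true) if p == 1 and y == 1)
            fp = sum(1 for p, y in zip(pred, y_true) if p == 1 and y == 0)
            fn = sum(1 for p, y in zip(pred, y_true) if p == 0 and y == 1)
            tn = sum(1 for p, y in zip(pred, y_true) if p == 0 and y == 0)
            precision = tp / (tp + fp) if tp + fp else 1.0
            recall = tp / (tp + fn) if tp + fn else 0.0  # true positive rate
            fpr = fp / (fp + tn) if fp + tn else 0.0     # false positive rate
            points.append((recall, precision, fpr))
        return points

Plotting precision against recall gives the Precision-Recall curve, and recall (the true positive rate) against fpr gives the ROC curve. The F-Beta curve instead holds the confusion matrix fixed and sweeps the beta parameter (see the advanced-metrics sketch at the end of this page).
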
Understanding Classification Metrics

Primary Metrics

  • Precision

    Measures how many of the predicted positives are actually positive: TP / (TP + FP). High precision means few false positives among the positive predictions.

  • Recall (Sensitivity)

    Measures how many of the actual positives were correctly identified. High recall means low false negative rate.

  • F1-Score

    Harmonic mean of precision and recall, giving a single score that balances the two (see the sketch after this list).

  • Accuracy

    Proportion of all predictions that were correct. Can be misleading with imbalanced datasets.
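
Each primary metric is simple arithmetic over the four confusion-matrix cells. The sketch below illustrates the formulas; it is not the calculator's actual source:

    def primary_metrics(tp, fp, fn, tn):
        """Precision, recall, F1, and accuracy from confusion-matrix counts."""
        precision = tp / (tp + fp) if tp + fp else 0.0  # TP / (TP + FP)
        recall = tp / (tp + fn) if tp + fn else 0.0     # TP / (TP + FN)
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)           # harmonic mean
        total = tp + fp + fn + tn
        accuracy = (tp + tn) / total if total else 0.0  # correct / all
        return precision, recall, f1, accuracy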

Advanced Metrics

  • Specificity

    Measures how many of the actual negatives were correctly identified; also called the true negative rate.

  • F-Beta Score

    Weighted harmonic mean of precision and recall. The beta parameter sets the weight of recall relative to precision: beta > 1 favors recall, beta < 1 favors precision (see the sketch after this list).

  • Matthews Correlation Coefficient (MCC)

    A balanced measure that works well even with imbalanced classes. Ranges from -1 (total disagreement) through 0 (chance level) to +1 (perfect prediction).

  • Balanced Accuracy

    Average of recall and specificity. Useful for imbalanced datasets.
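
The advanced metrics reduce to the same four counts; a companion sketch to the one above (again illustrative, with beta left as a free parameter):

    import math

    def advanced_metrics(tp, fp, fn, tn, beta=1.0):
        """Specificity, F-beta, MCC, and balanced accuracy from the counts."""
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        specificity = tn / (tn + fp) if tn + fp else 0.0  # true negative rate

        # F-beta: beta > 1 favors recall, beta < 1 favors precision.
        b2 = beta * beta
        f_beta = ((1 + b2) * precision * recall / (b2 * precision + recall)
                  if b2 * precision + recall else 0.0)

        # MCC: +1 perfect, 0 chance level, -1 total disagreement.
        denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        mcc = (tp * tn - fp * fn) / denom if denom else 0.0

        balanced_accuracy = (recall + specificity) / 2
        return specificity, f_beta, mcc, balanced_accuracy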