Metrics
compute_confusion_matrix(preds, labels, n_labels)
Function for computing a confusion matrix.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
preds | numpy.ndarray | A NumPy array of predictions formatted with shape (n_samples, n_labels). Provided by NeuralNetwork. | required |
labels | numpy.ndarray | Classification list with One-Hot Encoding. Provided by input_interface. | required |
n_labels | int | Number of classes. Provided by input_interface. | required |
Returns:

Name | Type | Description |
---|---|---|
rawcm | numpy.ndarray | NumPy matrix with shape (n_labels, n_labels). |
Source code in aucmedi/evaluation/metrics.py
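A minimal usage sketch (the import path is inferred from the source file listed above; the toy preds and labels arrays merely stand in for the output of NeuralNetwork and input_interface):

```python
import numpy as np
from aucmedi.evaluation.metrics import compute_confusion_matrix

# Toy example: 4 samples, 3 classes
# preds  -> prediction scores with shape (n_samples, n_labels)
# labels -> one-hot encoded ground truth with the same shape
preds = np.array([[0.8, 0.1, 0.1],
                  [0.2, 0.7, 0.1],
                  [0.1, 0.2, 0.7],
                  [0.5, 0.4, 0.1]])
labels = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [0, 0, 1],
                   [0, 1, 0]])

rawcm = compute_confusion_matrix(preds, labels, n_labels=3)
print(rawcm)   # numpy.ndarray with shape (3, 3)
```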
compute_metrics(preds, labels, n_labels, threshold=None)
Function for computing various classification metrics.
Computed metrics: F1, Accuracy, Sensitivity, Specificity, AUROC (AUC), Precision, FPR, FNR, FDR, TruePositives, TrueNegatives, FalsePositives, FalseNegatives
Parameters:

Name | Type | Description | Default |
---|---|---|---|
preds | numpy.ndarray | A NumPy array of predictions formatted with shape (n_samples, n_labels). Provided by NeuralNetwork. | required |
labels | numpy.ndarray | Classification list with One-Hot Encoding. Provided by input_interface. | required |
n_labels | int | Number of classes. Provided by input_interface. | required |
threshold | float | Only required for multi_label data. Threshold value above which a prediction counts as positive. | None |
Returns:

Name | Type | Description |
---|---|---|
metrics | pandas.DataFrame | DataFrame containing all computed metrics (except ROC). |
Source code in aucmedi/evaluation/metrics.py
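A short sketch of how the function might be called (import path inferred from the source file listed above; the toy data is the same as in the compute_confusion_matrix example):

```python
import numpy as np
from aucmedi.evaluation.metrics import compute_metrics

# Toy example: 4 samples, 3 classes
preds = np.array([[0.8, 0.1, 0.1],
                  [0.2, 0.7, 0.1],
                  [0.1, 0.2, 0.7],
                  [0.5, 0.4, 0.1]])
labels = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [0, 0, 1],
                   [0, 1, 0]])

# Standard (single-label) classification: threshold can stay at None
metrics = compute_metrics(preds, labels, n_labels=3)
print(metrics)   # pandas.DataFrame with the computed metrics per class

# For multi_label data, pass a threshold above which a prediction counts as positive:
# metrics = compute_metrics(preds, labels, n_labels=3, threshold=0.5)
```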
compute_roc(preds, labels, n_labels)
Function for computing the data of a ROC curve (FPR and TPR).
Parameters:

Name | Type | Description | Default |
---|---|---|---|
preds | numpy.ndarray | A NumPy array of predictions formatted with shape (n_samples, n_labels). Provided by NeuralNetwork. | required |
labels | numpy.ndarray | Classification list with One-Hot Encoding. Provided by input_interface. | required |
n_labels | int | Number of classes. Provided by input_interface. | required |
Returns:

Name | Type | Description |
---|---|---|
fpr_list | list of list | List containing a list of false positive rate points for each class. Shape: (n_labels, fpr_coords). |
tpr_list | list of list | List containing a list of true positive rate points for each class. Shape: (n_labels, tpr_coords). |
Source code in aucmedi/evaluation/metrics.py
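A minimal sketch for obtaining and plotting the ROC data (import path inferred from the source file listed above; the matplotlib plotting is only an illustration and not part of AUCMEDI's API):

```python
import numpy as np
import matplotlib.pyplot as plt
from aucmedi.evaluation.metrics import compute_roc

# Toy example: 4 samples, 3 classes
preds = np.array([[0.8, 0.1, 0.1],
                  [0.2, 0.7, 0.1],
                  [0.1, 0.2, 0.7],
                  [0.5, 0.4, 0.1]])
labels = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [0, 0, 1],
                   [0, 1, 0]])

# Returns one list of FPR points and one list of TPR points per class
fpr_list, tpr_list = compute_roc(preds, labels, n_labels=3)

# Plot one ROC curve per class
for i in range(3):
    plt.plot(fpr_list[i], tpr_list[i], label="Class %d" % i)
plt.xlabel("False Positive Rate")
plt.ylabel("True Positive Rate")
plt.legend()
plt.show()
```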