Python: How to calculate precision and recall in Keras
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse it, you must likewise follow the CC BY-SA license, link to the original, and attribute it to the original authors (not me): StackOverFlow
原文地址: http://stackoverflow.com/questions/43076609/
How to calculate precision and recall in Keras
Asked by Jimmy Du
I am building a multi-class classifier with Keras 2.02 (with the TensorFlow backend), and I do not know how to calculate precision and recall in Keras. Please help me.
Answered by Yasha Bubnov
The Python package keras-metrics could be useful for this (I'm the package's author).
import keras
import keras_metrics
from keras import models

model = models.Sequential()
model.add(keras.layers.Dense(1, activation="sigmoid", input_dim=2))
model.add(keras.layers.Dense(1, activation="softmax"))

model.compile(optimizer="sgd",
              loss="binary_crossentropy",
              metrics=[keras_metrics.precision(), keras_metrics.recall()])
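As a quick sanity check (my addition, using hypothetical toy data with two features to match input_dim=2), fitting the compiled model should now report precision and recall alongside the loss for every epoch:

import numpy as np

# Hypothetical toy data: 100 samples, 2 features, binary labels.
x = np.random.random((100, 2))
y = np.random.randint(0, 2, size=(100, 1))

# Each epoch's log line should now include precision and recall.
model.fit(x, y, epochs=5, batch_size=10)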
UPDATE: Starting with Keras version 2.3.0, metrics such as precision and recall are provided within the library distribution package.
Usage is as follows:
model.compile(optimizer="sgd",
loss="binary_crossentropy",
metrics=[keras.metrics.Precision(), keras.metrics.Recall()])
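For the multi-class case in the question, these built-in metrics are binary by default. One option is to track precision and recall for a particular class of a softmax output via the class_id argument; this is a sketch based on my understanding of recent Keras/tf.keras versions, so double-check it against the version you use:

# Hypothetical 3-class model with one-hot labels; class_id picks the class
# for which precision and recall are computed (assumed API, verify locally).
model.compile(optimizer="sgd",
              loss="categorical_crossentropy",
              metrics=[keras.metrics.Precision(class_id=0, name="precision_0"),
                       keras.metrics.Recall(class_id=0, name="recall_0")])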
Answered by Dref360
As of Keras 2.0, precision and recall were removed from the master branch. You will have to implement them yourself. Follow this guide to create custom metrics: Here.
The precision and recall equations can be found Here.
Or reuse the code from Keras before it was removed: Here.
These metrics were removed because they were computed batch-wise, so the values may or may not be correct.
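For reference, the removed metrics were simple batch-wise functions built on the Keras backend. A sketch in that spirit (reconstructed from memory, so it may differ slightly from the exact removed code) looks like this:

from keras import backend as K

def precision(y_true, y_pred):
    # Batch-wise precision: true positives / predicted positives.
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
    predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
    return true_positives / (predicted_positives + K.epsilon())

def recall(y_true, y_pred):
    # Batch-wise recall: true positives / actual positives.
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
    possible_positives = K.sum(K.round(K.clip(y_true, 0, 1)))
    return true_positives / (possible_positives + K.epsilon())

# Usable as metrics=[precision, recall], with the batch-wise caveat noted above.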
Answered by vogdb
My answer is based on a comment in a Keras GH issue. It calculates validation precision and recall at every epoch for a one-hot-encoded classification task. Also please look at this SO answer to see how it can be done with keras.backend functionality.
import keras
import numpy as np
from keras.optimizers import SGD
from sklearn.metrics import precision_score, recall_score

model = keras.models.Sequential()
# ...
sgd = SGD(lr=0.001, momentum=0.9)
model.compile(optimizer=sgd, loss='categorical_crossentropy', metrics=['accuracy'])


class Metrics(keras.callbacks.Callback):
    def on_train_begin(self, logs={}):
        self._data = []

    def on_epoch_end(self, batch, logs={}):
        X_val, y_val = self.validation_data[0], self.validation_data[1]

        y_predict = np.asarray(model.predict(X_val))

        # Convert one-hot labels and softmax outputs to class indices.
        y_val = np.argmax(y_val, axis=1)
        y_predict = np.argmax(y_predict, axis=1)

        # average='macro' so the scores also work with more than two classes.
        self._data.append({
            'val_recall': recall_score(y_val, y_predict, average='macro'),
            'val_precision': precision_score(y_val, y_predict, average='macro'),
        })
        return

    def get_data(self):
        return self._data


metrics = Metrics()
history = model.fit(X_train, y_train, epochs=100, validation_data=(X_val, y_val), callbacks=[metrics])
metrics.get_data()
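As a small follow-up (my addition, assuming pandas is installed; it is not part of the original answer), the collected per-epoch values are easy to inspect as a table:

import pandas as pd

# One dict per epoch -> one row per epoch.
df = pd.DataFrame(metrics.get_data())
print(df)         # columns: val_recall, val_precision
print(df.mean())  # average over all epochs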
Answered by vsocrates
This thread is a little stale, but just in case it helps someone landing here: if you are willing to upgrade to Keras v2.1.6, there has been a lot of work on getting stateful metrics to work, though more work appears to be in progress (https://github.com/keras-team/keras/pull/9446).
Anyway, I found the best way to integrate precision/recall was to use a custom metric that subclasses Layer, as shown by example in BinaryTruePositives.
For recall, this would look like:
import keras
from keras import backend as K


class Recall(keras.layers.Layer):
    """Stateful metric to count the total recall over all batches.

    Assumes predictions and targets of shape `(samples, 1)`.

    # Arguments
        name: String, name for the metric.
    """

    def __init__(self, name='recall', **kwargs):
        super(Recall, self).__init__(name=name, **kwargs)
        self.stateful = True
        self.recall = K.variable(value=0.0, dtype='float32')
        self.true_positives = K.variable(value=0, dtype='int32')
        self.false_negatives = K.variable(value=0, dtype='int32')

    def reset_states(self):
        K.set_value(self.recall, 0.0)
        K.set_value(self.true_positives, 0)
        K.set_value(self.false_negatives, 0)

    def __call__(self, y_true, y_pred):
        """Updates the running recall with the current batch.

        # Arguments
            y_true: Tensor, batch-wise labels
            y_pred: Tensor, batch-wise predictions

        # Returns
            The recall over all batches seen this epoch at the
            completion of the batch.
        """
        y_true = K.cast(y_true, 'int32')
        y_pred = K.cast(K.round(y_pred), 'int32')

        # False negative calculations: label is 1 but prediction is 0.
        false_neg = K.cast(K.sum(K.cast(K.greater(y_true, y_pred), 'int32')), 'int32')
        current_false_neg = self.false_negatives * 1
        self.add_update(K.update_add(self.false_negatives,
                                     false_neg),
                        inputs=[y_true, y_pred])

        # True positive calculations: prediction matches a positive label.
        correct_preds = K.cast(K.equal(y_pred, y_true), 'int32')
        true_pos = K.cast(K.sum(correct_preds * y_true), 'int32')
        current_true_pos = self.true_positives * 1
        self.add_update(K.update_add(self.true_positives,
                                     true_pos),
                        inputs=[y_true, y_pred])

        # Combine: recall = TP / (TP + FN).
        recall = (K.cast(self.true_positives, 'float32') /
                  (K.cast(self.true_positives, 'float32') +
                   K.cast(self.false_negatives, 'float32') +
                   K.cast(K.epsilon(), 'float32')))
        self.add_update(K.update(self.recall, recall),
                        inputs=[y_true, y_pred])

        return recall
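A usage sketch (my addition, assuming a binary model with a single sigmoid output of shape (samples, 1)): because the metric is stateful, an instance is passed to compile and its value accumulates across batches within an epoch:

model.compile(optimizer='sgd',
              loss='binary_crossentropy',
              metrics=['accuracy', Recall()])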
Answered by Jesús Utrera
Use the scikit-learn framework for this.
import numpy as np
from sklearn.metrics import classification_report

history = model.fit(x_train, y_train, batch_size=32, epochs=10, verbose=1,
                    validation_data=(x_test, y_test), shuffle=True)

# Predict class probabilities on the test set and take the argmax as the label.
pred = model.predict(x_test, batch_size=32, verbose=1)
predicted = np.argmax(pred, axis=1)

# Per-class precision, recall and F1, computed from the one-hot test labels.
report = classification_report(np.argmax(y_test, axis=1), predicted)
print(report)
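If you need the numbers programmatically rather than as a printed table, classification_report can also return a dictionary (output_dict=True, available in scikit-learn 0.20 and later):

# Nested dicts: per-class entries plus 'macro avg' and 'weighted avg'.
report_dict = classification_report(np.argmax(y_test, axis=1), predicted,
                                    output_dict=True)
print(report_dict['macro avg']['precision'], report_dict['macro avg']['recall'])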
This blog is very useful.