<p>Have you tried <code>tf.contrib.estimator.add_metrics(estimator, metric_fn)</code> (<a href="https://www.tensorflow.org/api_docs/python/tf/contrib/estimator/add_metrics" rel="nofollow noreferrer">doc</a>)? It takes an initialized Estimator (which can be a pre-canned one) and adds to it the metrics defined by <code>metric_fn</code>.</p>
<p>Usage example:</p>
<pre class="lang-python prettyprint-override"><code>def custom_metric(labels, predictions):
    # This function will be called by the Estimator, which passes it its predictions.
    # Let's suppose you want to add the "mean" metric...
    # Access the class predictions (careful: the key name may change from one canned Estimator to another):
    predicted_classes = predictions["class_ids"]
    # Define the metric (value and update-op tensors):
    custom_metric = tf.metrics.mean(labels, predicted_classes, name="custom_metric")
    # Return it as a dict:
    return {"custom_metric": custom_metric}

# Initialize your canned Estimator:
classifier = tf.estimator.DNNClassifier(feature_columns=columns_feat,
                                        hidden_units=[10, 10],
                                        n_classes=NUM_CLASSES)

# Add your custom metrics:
classifier = tf.contrib.estimator.add_metrics(classifier, custom_metric)

# Train/evaluate:
tf.logging.set_verbosity(tf.logging.INFO)  # Just to have some logs to display for demonstration
train_spec = tf.estimator.TrainSpec(input_fn=lambda: your_train_dataset_function(),
                                    max_steps=TRAIN_STEPS)
eval_spec = tf.estimator.EvalSpec(input_fn=lambda: your_test_dataset_function(),
                                  steps=EVAL_STEPS,
                                  start_delay_secs=EVAL_DELAY,
                                  throttle_secs=EVAL_INTERVAL)
tf.estimator.train_and_evaluate(classifier, train_spec, eval_spec)
</code></pre>
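<p>Conceptually, <code>add_metrics</code> merges the dict returned by your <code>metric_fn</code> into the estimator's default evaluation metrics. Here is a framework-free sketch of that contract (a simplified illustration only, not the actual TensorFlow implementation; <code>add_metrics_sketch</code> and the plain-Python mean are stand-ins):</p>
<pre class="lang-python prettyprint-override"><code>def add_metrics_sketch(base_metrics, metric_fn, labels, predictions):
    # Mimic add_metrics: keep the estimator's default metrics and
    # merge in whatever the custom metric_fn returns.
    merged = dict(base_metrics)
    merged.update(metric_fn(labels, predictions))
    return merged

def custom_metric(labels, predictions):
    predicted = predictions["class_ids"]
    # Plain mean as a stand-in for tf.metrics.mean:
    return {"custom_metric": sum(predicted) / len(predicted)}

base = {"accuracy": 0.9, "loss": 0.3}  # hypothetical default eval metrics
result = add_metrics_sketch(base, custom_metric, [1, 0], {"class_ids": [1, 1]})
# result now contains accuracy, loss, and custom_metric
</code></pre>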
<p>During evaluation, the logs will then show <code>custom_metric</code> reported alongside the default metrics and the loss.</p>