
In tensorflow.metrics, difference between precision_at_k and precision_at_top_k?


In the TensorFlow Python API, tf.metrics provides a few metrics for information retrieval.

In particular:

  • tf.metrics.precision_at_k and tf.metrics.precision_at_top_k
  • tf.metrics.recall_at_k and tf.metrics.recall_at_top_k

What is the difference between the _at_k and _at_top_k metrics?

The API documentation does not seem to give information on this.


Solution

Looking at their implementation, precision_at_k is a thin wrapper around precision_at_top_k. The difference is actually mentioned in the API docs: precision_at_k expects a tensor of logits (or scores) as predictions, whereas precision_at_top_k expects the predictions to already be the indices of the top k classes. In essence, precision_at_k simply applies tf.nn.top_k to the predictions and then calls precision_at_top_k. The recall_at_k / recall_at_top_k pair follows the same pattern.
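
For illustration, here is a minimal sketch of that relationship in the TF 1.x graph-mode API. The toy logits and labels are made up for this example; the point is that passing raw logits to precision_at_k should give the same value as computing tf.nn.top_k yourself and feeding the indices to precision_at_top_k.

```python
import tensorflow as tf

# Toy data (assumption): 2 examples, 4 classes, one relevant class per example.
logits = tf.constant([[0.1, 3.0, 0.2, 0.5],
                      [2.0, 0.3, 0.1, 0.9]])
labels = tf.constant([[1], [2]], dtype=tf.int64)
k = 2

# precision_at_k takes the raw scores/logits and applies top_k internally.
prec_k, prec_k_update = tf.metrics.precision_at_k(labels, logits, k=k)

# precision_at_top_k takes precomputed top-k class indices instead.
_, top_k_idx = tf.nn.top_k(logits, k=k)
prec_top_k, prec_top_k_update = tf.metrics.precision_at_top_k(
    labels, predictions_idx=tf.cast(top_k_idx, tf.int64), k=k)

with tf.Session() as sess:
    # The streaming metrics keep their counters in local variables.
    sess.run(tf.local_variables_initializer())
    sess.run([prec_k_update, prec_top_k_update])
    print(sess.run([prec_k, prec_top_k]))  # both values should match
```

So if you already have the top-k indices (e.g. from a retrieval step), precision_at_top_k lets you skip the extra top_k computation; otherwise precision_at_k is the more convenient entry point.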