Some of the metrics for systems like IR are precision and recall. Their definitions are clear, but I'm unsure: when a system returns no output, should we consider its precision 1 or 0? Or should we distinguish between the cases where a gold answer exists and where it does not when computing precision in this situation?
If this question is off-topic, I'd appreciate guidance on where I can ask it.
Thanks.
Precision is defined as the fraction of relevant items among all retrieved. Precision also gives the probability that a retrieved item is relevant.
The formula is:
                #(relevant items retrieved)
    Precision = ---------------------------
                #(retrieved items)
The value of precision ranges from 0 (none of the retrieved items is relevant) to 1 (all of the retrieved items are relevant).
If a system doesn't retrieve any item, then its precision is 0.
I assume this happens only for a few queries; hopefully, over a large number of queries, the system will retrieve some results, and then the precision formula will make more sense.
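As a minimal sketch of the convention above, here is a small Python function (names and the `empty_value` parameter are my own, not from any standard library) that computes precision and falls back to a configurable value when nothing is retrieved, since the formula itself gives 0/0 in that case:

```python
def precision(retrieved, relevant, empty_value=0.0):
    """Fraction of retrieved items that are relevant.

    `empty_value` is returned when nothing was retrieved
    (the 0/0 case); defaulting to 0.0 matches the convention
    described above, but some evaluations choose differently.
    """
    if not retrieved:
        return empty_value
    relevant_set = set(relevant)
    hits = sum(1 for item in retrieved if item in relevant_set)
    return hits / len(retrieved)

# A query where 2 of the 4 retrieved items are relevant:
print(precision(["d1", "d2", "d3", "d4"], ["d2", "d4", "d9"]))  # 0.5

# A query with no output: precision falls back to empty_value.
print(precision([], ["d2"]))  # 0.0
```

Making the empty-retrieval behavior an explicit parameter keeps the choice visible instead of burying it in the arithmetic.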