Precision@K

Objective:

Precision@K measures the fraction of the top K retrieved or generated items that are relevant to a query. It is a rank-based evaluation metric commonly used in information retrieval and recommendation systems. A higher Precision@K score means that more of the model's top K results are relevant, indicating strong early retrieval performance, which matters most when relevance at the highest-ranked positions is crucial.
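
Concretely, Precision@K is the number of relevant items among the top K results divided by K. A minimal sketch in Python, assuming each ranked item carries a binary relevance label (this helper is illustrative only and not part of the SDK):

def precision_at_k(relevance_labels, k):
    """Compute Precision@K from a ranked list of binary relevance labels.

    relevance_labels: list of 0/1 values, ordered by the model's ranking
    k: cutoff position (number of top results to consider)
    """
    if k <= 0:
        raise ValueError("k must be a positive integer")
    top_k = relevance_labels[:k]
    return sum(top_k) / k

# Example: 3 of the top 5 ranked items are relevant -> Precision@5 = 0.6
print(precision_at_k([1, 0, 1, 1, 0, 1, 0], k=5))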

Required Columns in Dataset:

Prompt, Ranked Context, Labeled Text

Interpretation:

  • High Precision@K: Shows that the top K results are highly relevant to the query, indicating the model's effectiveness in prioritizing the most appropriate outputs early in the ranking.

  • Low Precision@K: Suggests that many of the top K results are irrelevant to the query, reflecting weak retrieval performance in terms of precision.

Execution via UI:

Execution via SDK:

Precision_K does not require an LLM for computation.

metrics=[
    # Precision_K configuration; schema_mapping links the required dataset columns
    # (Prompt, Ranked Context, Labeled Text) to the fields the metric expects
    {"name": "Precision_K", "column_name": "your-text", "schema_mapping": schema_mapping}
]
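
Here, schema_mapping is assumed to be a mapping defined elsewhere in your evaluation setup that ties the required dataset columns (Prompt, Ranked Context, Labeled Text) to the metric's expected fields, and "your-text" is a placeholder for the column name in your own dataset.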
