Context Recall
Objective: This metric measures how well the retrieval step surfaces documents containing the ground-truth facts. Simply put, it returns the proportion of facts in the expected response that are supported by the supplied context documents.
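In line with the example below, the score can be written as:

$$
\text{Context Recall} = \frac{\text{number of expected-response facts supported by the context}}{\text{total number of facts in the expected response}}
$$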
Required Parameters: Prompt, Expected Response, Context
Interpretation: A higher score indicates that a larger proportion of the ground-truth facts in the expected response are covered by the context supplied to the LLM.
Code Execution:
The "schema_mapping" variable needs to be defined first and is a pre-requisite for evaluation runs. Learn how to set this variable here.
Example:
Prompt: What is the chemical formula for water and what are the different elements in it?
Expected Response: The chemical formula for water is H2O and it is composed of two elements: hydrogen and oxygen.
Context: ["Water is essential for all known forms of life and is a major component of the Earth's hydrosphere.", "Water chemical formula is H2O.", "The chemical formula for carbon dioxide is CO2, which is a greenhouse gas."]
Metric Output: {"score": 0.5, "reason": "Context does not contain any information about the elements of water."}
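For intuition, here is a minimal, self-contained sketch of how a score of 0.5 arises on this example: the formula fact is grounded in the context, the elements fact is not. The keyword-overlap check is a toy stand-in for the LLM judgment the real metric performs; all names are illustrative.

```python
# Toy reproduction of the example: score = supported facts / total facts.
# The keyword-overlap test is a stand-in for an LLM attribution check.
claims = [
    "The chemical formula for water is H2O.",
    "Water is composed of two elements: hydrogen and oxygen.",
]
context = [
    "Water is essential for all known forms of life and is a major component "
    "of the Earth's hydrosphere.",
    "Water chemical formula is H2O.",
    "The chemical formula for carbon dioxide is CO2, which is a greenhouse gas.",
]

def supported(claim: str, docs: list[str]) -> bool:
    # A claim counts as supported if some document shares at least half of
    # the claim's informative (longer than three characters) words.
    words = {w.strip(".,:").lower() for w in claim.split() if len(w) > 3}
    return any(
        len(words & {w.strip(".,:").lower() for w in doc.split()}) / len(words) >= 0.5
        for doc in docs
    )

score = sum(supported(c, context) for c in claims) / len(claims)
print(score)  # 0.5 -- only the formula claim is grounded in the context
```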