The micro precision metric measures the ratio of correct predictions, aggregated across all classes, to the total number of predictions.
Metric details
Micro precision is a generative AI quality metric that measures how well generative AI assets perform entity extraction tasks with multi-label or multi-class predictions.
Scope
The micro precision metric evaluates generative AI assets only.
Types of AI assets: Prompt templates
Generative AI tasks: Entity extraction
Supported languages: English
Scores and values
The micro precision score is the ratio of correct predictions, aggregated across all classes, to the total number of predictions. Scores range from 0 to 1, and higher scores indicate that predictions are more accurate.
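The following sketch illustrates how this ratio is computed for a multi-label prediction task. It is not the product's internal implementation; it uses scikit-learn and a small hypothetical set of entity classes to show that micro precision pools true positives and false positives across all classes before taking the ratio.

```python
# Minimal sketch of micro precision for multi-label predictions.
# The label matrices below are illustrative, not from the product.
import numpy as np
from sklearn.metrics import precision_score

# Rows are documents, columns are entity classes (e.g. PERSON, ORG, DATE);
# a 1 means the class was assigned to the document.
y_true = np.array([[1, 0, 1],
                   [0, 1, 0],
                   [1, 1, 0]])
y_pred = np.array([[1, 0, 0],
                   [0, 1, 1],
                   [1, 1, 0]])

# Micro averaging pools counts across all classes:
# micro precision = TP_total / (TP_total + FP_total)
micro_precision = precision_score(y_true, y_pred, average="micro")

# The same value computed by hand from pooled counts.
tp = np.logical_and(y_true == 1, y_pred == 1).sum()
fp = np.logical_and(y_true == 0, y_pred == 1).sum()

print(micro_precision)   # 0.8
print(tp / (tp + fp))    # 0.8
```

In this example, 4 of the 5 predicted labels are correct, so the micro precision score is 0.8.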