Tracking models with AI Factsheets
Use a model inventory with AI Factsheets to organize machine learning models for your organization as part of your AI governance strategy. You can create a model use case for every model you want to track, and then review the details from the inventory page in the catalog. Use filters to refine the view. For example, view just the model use cases that are in an approved state.
To view a model inventory, choose Model inventory from the Catalog section of the navigation pane.
Working with model use cases
From the Model inventory, you can:
- Create a model use case
- View model use case details
- Change the status for a model use case
- Delete a model use case
- Search or filter to find model use cases
- View alerts
- View the factsheet for a model
- View the factsheet for a deployment
- View tracking details
Creating a model use case
Create a model use case to track a model's lifecycle.
- From the navigation pane, click the Catalog section and then click Model inventory.
- Click New model use case and enter a name and description.
- Choose the catalog that will contain the model use case.
- After you create the use case, you can specify extra details such as the Model purpose, Business terms, or Supporting documentation.
Viewing model use case details
A model inventory contains model use cases that provide details about the model lifecycle, purpose, and facts that support governance. For each model use case, you can view:
- The model use case name.
- The status for the model use case. For example, a model might be awaiting development, under review, or in production.
- Business terms associated with the model use case.
- Tags available for the model use case.
For complete details on what is included for each model, see Viewing model and deployment factsheets.
Changing the status of a model use case
You can change the status of a model use case to reflect a change in the model's lifecycle. For example, you can change the status of a model from Deployed for validation to Validated after you confirm that the model was adequately tested.
To change the status of a model use case:
- Open the model use case in the model inventory.
- Switch to the Asset tab.
- Click the Edit icon.
- Select the new status and click Update to save.
The model use case is displayed in the new status category.
Deleting a model use case
To delete a model use case from the inventory, click Remove from the action menu on the model use case tile. The model use case is removed from the inventory, and all associations with the model in a project or space are deleted.
Searching or filtering to find model use cases
- Use the search bar to search for a model use case by name, tag, classification, or business term.
- Use the filters in the model inventory view to refine the list. You can filter by alert (such as a change in drift, fairness, or quality), business terms, catalog, classification, tags, and status.
Viewing alerts
When you evaluate a model, you can set thresholds to trigger an alert when a threshold is met. For example, you can set an alert that is triggered if bias detection for a monitored group changes by 10%. You can view the alerts for a model from the model use case.
If any of the model deployments have alerts, the alerts are displayed in the model inventory as a badge on the model use case. Choose Preview alerts from the action menu on a model use case in the model inventory page to see the details of the alerts.
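The relative-change check described above can be sketched in plain Python. The function below is a hypothetical illustration of how a 10% change threshold might be evaluated for a monitored metric; it is not a Watson OpenScale API, and the name and signature are invented.

```python
def alert_triggered(baseline, current, threshold=0.10):
    """Return True when a monitored metric changes by at least `threshold`,
    expressed as a fraction of the baseline value (hypothetical helper)."""
    if baseline == 0:
        return current != 0
    return abs(current - baseline) / abs(baseline) >= threshold

# Example: a fairness score dropping from 0.80 to 0.70 is a 12.5% change,
# which crosses a 10% threshold and would raise an alert.
```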
Viewing the factsheet for a model
From the model tracking page for a model use case in an inventory, click the Asset tab to view the associated factsheet. You can also view the factsheet when you view model details in a project, space, or catalog. For details, see Viewing a model factsheet.
After a model is registered, facts about the model are captured and preserved automatically at each point in the model lifecycle, including whenever a model is updated or changed. For example, events are captured when a model is:
- Updated in any way, including having a tag or description revised
- Promoted to a space
- Published to a catalog
- Tested with a deployment
- Evaluated in Watson OpenScale
No user intervention is required to update the factsheet.
Additionally, facts are preserved about a model in a notebook after the IBM AI Governance Facts Client library (AIGovFactsClient) is initialized in the notebook. From that point, lifecycle metadata and metrics are preserved automatically.
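Conceptually, once fact capture is active, an event is recorded each time the model's state changes, with no explicit logging calls in your training code. The class below is a toy stand-in that illustrates that event log; it is not the AIGovFactsClient API, and all names in it are invented.

```python
from datetime import datetime, timezone

class FactsLog:
    """Toy stand-in illustrating automatic fact capture (not the real client)."""
    def __init__(self):
        self.events = []

    def record(self, event, **details):
        # Each lifecycle event is preserved with a timestamp and its details.
        self.events.append({
            "event": event,
            "time": datetime.now(timezone.utc).isoformat(),
            **details,
        })

log = FactsLog()
log.record("promoted_to_space", space="pre-prod-space")
log.record("evaluated", tool="Watson OpenScale", quality=0.91)
```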
Viewing the factsheet for a deployment
From the model details page for a model use case in an inventory, click the Asset tab to view the associated factsheet. You can also view the factsheet when you view deployment details in a space. For details, see Viewing a model factsheet.
When tracking is enabled, the recording of facts is triggered by actions on the model or related assets. Similarly, changes in the AI lifecycle control where the model displays in a model use case, showing the progression from development to operation. For example, when a model is saved in a project, the model displays in the Develop pillar. When the model is promoted to a space, the use case is updated to show the model in the Test pillar, and so on.
| Tag or evaluation stage | Validate pillar | Operate pillar |
| --- | --- | --- |
| Space tagged with AIGovernance: Pre-production | ✓ | |
| Space tagged with AIGovernance: Production | | ✓ |
| Watson OpenScale Pre-production | ✓ | |
| Watson OpenScale Production | | ✓ |
These notes describe how the interaction between the lifecycle components controls where a Watson Machine Learning model displays in a model use case.
When a model is promoted to a space, the model displays in the Test pillar. If the deployment for that model is evaluated in Watson OpenScale and tagged as pre-production, then the model displays in Validate rather than Test. If the model deployment is tagged as production instead of pre-production, then it moves to the Operate pillar.
If you assign the space tag AIGovernance: Pre-production or AIGovernance: Production, the model displays in the Validate or Operate pillar, respectively. Note: If the tag is added to the space after the model is tracked, this change is triggered by an action that results in a change to the model metadata, such as updating the model name, description, or tags.
If a model is promoted to a space that has the tag AIGovernance: Pre-production, the model displays in the Validate pillar. If the deployment for that model is evaluated in Watson OpenScale and tagged as pre-production, the model remains in the Validate pillar. If the model deployment is tagged as production instead of pre-production, it moves to the Operate pillar.
If a model is promoted to a space that has the tag AIGovernance: Production, it displays in the Operate pillar. If the deployment for this model is evaluated in Watson OpenScale and tagged as production, it remains in the Operate pillar. If it is tagged as pre-production instead, it displays in the Validate pillar.
If a model deployment is not evaluated in Watson OpenScale, it will have a Pending Evaluation tag. If the model deployment is evaluated, it will have an Evaluated tag. If the model deployment is approved after evaluation in Watson OpenScale, it will have an