End-to-end model governance tutorial

This tutorial sets up an end-to-end model risk management solution with Watson OpenScale and IBM OpenPages. Your goal is to create a model and then set up monitoring to evaluate the model and its predictions. Use IBM OpenPages to create a model and track its progress through a workflow. Then, establish a subscription between IBM OpenPages and Watson OpenScale. Finally, use Watson OpenScale to evaluate the quality and fairness of the model and to better understand how the model determined its predicted outcomes.

Tutorial overview

In this tutorial, you will perform the following tasks:

  • Create and deploy a model on IBM OpenPages for pre-implementation review.
  • Run a notebook to set up a connection between IBM OpenPages and Watson OpenScale.
  • Evaluate a model by using Watson OpenScale metrics.

Before you begin

Complete the following steps to prepare for the tutorial:

Creating a model in IBM OpenPages

Create a model in IBM OpenPages to monitor risk, governance, and compliance.

Task 1: Create a model in IBM OpenPages

  1. From the IBM OpenPages menu, click Inventory > Models.
  2. Click New.
  3. Complete the following fields.
    Note: More fields appear after Machine Learning Model is set to Yes.
Field name | Set to
Description | Creating a new model for OpenPages
Model Status | Proposed
Model Owner | Your account name
Model or Non-Model | Model
Machine Learning Model | Yes
Monitored with Watson Studio | Yes
Parent Entities | Your organization's name

Saving the form returns you to the Models home page.

Task 2: Move the model through the candidate workflow

  1. From the Models home page, click the created model.
  2. Click the Edit icon to enter a Candidate comment. This field describes why the proposed model is a model and not a non-model.
  3. From the Action drop-down, click Submit Candidate for Confirmation and then click Continue. In a live workflow, this sends the candidate model to a reviewer for approval.
  4. From the Action drop-down, select Confirm Assessment. In a live workflow, a reviewer confirms that the model candidate is a model and the candidate workflow is complete.

Task 3: Move the model through the model development workflow to the pre-implementation review stage

Follow these steps to continue tracking the model through the model development workflow:

  1. From the Action drop-down, click Start Model Development.
  2. Complete the required fields listed under All Key Items.
  3. Under Model Development, enter your account name as the Developer.
  4. From the Action drop-down, click Assign to Developer and then click Confirm.
  5. From the Action drop-down, select Submit for Pre-Implementation Review.

Your model is now ready to evaluate with Watson OpenScale.

Running a notebook to connect Watson OpenScale to IBM OpenPages

In IBM Watson Studio, create a project and run a notebook to perform the following setup tasks:

  • Create two machine learning models.
  • Connect Watson OpenScale to IBM OpenPages.
  • Create model deployments and configure monitors in Watson OpenScale.

Task 1: Create a pre-production project in Watson Studio

Your first task is to create a pre-production project to which you associate the Watson Machine Learning instance.

  1. Create a project where you can run the notebook, and name it MRM - Pre-prod.
  2. Associate MRM - Pre-prod with a Watson Machine Learning instance.

Task 2: Create a deployment space

Make sure that you are provisioned to use Watson Studio, Watson Machine Learning, and Watson OpenScale. Then, complete the following steps:

  1. Create a deployment space where you can view and test the results, and name it Credit risk - preproduction.
  2. Copy the space GUID from the Manage tab. You need this GUID when you run the notebook.
  3. Download the sample training data file.

Task 3: Add the sample notebook to the project

Use the notebook to set up a connection between Watson OpenScale and IBM OpenPages. This connection is necessary to create and deploy pre-production models, and to configure the model deployments in Watson OpenScale.

  1. Click the Assets tab in the MRM - Pre-prod project, and then click New asset.
  2. Search for and click Jupyter notebook editor.
  3. Select From file, and then upload or drag and drop the OpenScale and OpenPages model risk management on Cloud Pak for Data.ipynb notebook file.
  4. Click Create.

Task 4: Run the sample notebook

Follow these steps to run the notebook to create your pre-production model.

  1. From the Assets tab, open the MRM E2E with OpenPages on CP4D notebook.
  2. Since the notebook is in read-only mode, click the Edit icon to place the notebook in edit mode.
  3. The first cell requires your input.
    1. For url, enter your Cloud Pak for Data hostname beginning with https://, and press Enter.
      Example: https://mycpdcluster.mycompany.com.
    2. For username, enter your Cloud Pak for Data username, and press Enter.
    3. For password, enter your Cloud Pak for Data password, and press Enter.
  4. Database credentials and schema require your input. Obtain your database credentials and schema name from your database administrator.
  5. For Integration system credentials (IBM OpenPages MRG), paste the URL, username, and password of the IBM OpenPages system that IBM provided.
  6. For Model ID, open your OpenPages model, and copy-and-paste the last digits of the URL.
    Example: in https://mrgbeta.op-ibm.com/app/jspview/react/grc/task-view/7916, the model ID is 7916.
  7. Run the notebook cell by cell. You can monitor progress as the asterisk in "In [*]" changes to a number, for example "In [1]".
    Note: Be sure to read the directions for each cell before running it and moving to the next cell. You might need to leave the notebook and perform other tasks. For example, you must manually add the deployed models to the OpenScale Insights dashboard before running model monitors.
  8. To restart the notebook and clear the output, click Kernel > Restart & Clear Output.
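As the model ID example above shows, the value the notebook needs is simply the trailing numeric path segment of the OpenPages task-view URL. A minimal sketch of extracting it programmatically (a hypothetical convenience helper, not part of the product API):

```python
from urllib.parse import urlparse

def extract_model_id(url: str) -> str:
    """Return the trailing numeric path segment of an OpenPages task-view URL."""
    path = urlparse(url).path            # drops any query string or fragment
    return path.rstrip("/").rsplit("/", 1)[-1]

print(extract_model_id(
    "https://mrgbeta.op-ibm.com/app/jspview/react/grc/task-view/7916"
))  # -> 7916
```

Using urlparse first means a stray "?" or query string on the copied URL does not end up in the extracted ID.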

Viewing insights in Watson OpenScale

The model risk management notebook generates three tiles that you can view on the Insights dashboard: GermanCreditRiskModelPreProduction, GermanCreditRiskModelChallenger, and GermanCreditRiskModel. Use Watson OpenScale to evaluate the quality and fairness of each model and to understand how the models reached their predicted outcomes.

Task 1: View the model monitors for fairness

The Watson OpenScale fairness monitor evaluates the fairness of your model. These fairness metrics determine whether bias exists in the predicted outcomes.

  1. From the Insights dashboard, select a model.
  2. In the Fairness section, click the Configure icon. The fairness monitor uses disparate impact to determine fairness. Disparate impact compares the percentage of favorable outcomes for a monitored group to the percentage of favorable outcomes for a reference group.
  3. To return to the model details screen, click Go to model summary.
  4. To view more detailed fairness results, click the right arrow icon. Here, you can reconfigure the monitored attributes and data set and view how the graph changes.
  5. To return to model details, click the model's navigation trail.
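The disparate impact ratio described in step 2 can be sketched in a few lines. The group counts and the 0.8 flag threshold (the common "four-fifths rule") below are illustrative assumptions, not values taken from OpenScale:

```python
def disparate_impact(monitored_favorable: int, monitored_total: int,
                     reference_favorable: int, reference_total: int) -> float:
    """Ratio of the monitored group's favorable-outcome rate to the
    reference group's favorable-outcome rate."""
    monitored_rate = monitored_favorable / monitored_total
    reference_rate = reference_favorable / reference_total
    return monitored_rate / reference_rate

# Hypothetical example: 70 of 100 monitored applicants approved,
# versus 80 of 100 reference applicants approved.
ratio = disparate_impact(70, 100, 80, 100)
print(round(ratio, 3))  # -> 0.875
```

A ratio of 1.0 means both groups receive favorable outcomes at the same rate; values below a chosen threshold (often 0.8) are commonly flagged as potential bias.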

Task 2: View the model monitors for quality

The Watson OpenScale quality monitor evaluates the quality of your model. These quality metrics determine how well your model predicts outcomes.

  1. In the Quality section, click the Configure icon. The quality monitor uses many metrics to determine quality. Detailed information on these metrics can be found at Quality metrics overview.
  2. To return to the model details screen, click Go to model summary.
  3. To view more detailed quality results, click the right arrow icon. Here, you see all the quality metric calculations and a confusion matrix showing correct model decisions along with false positives and false negatives.
  4. To return to model details, click the model's navigation trail.
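The confusion matrix shown in step 3 is the basis for several of the quality metrics. A minimal sketch of how common metrics are derived from the four cells (the counts below are invented for illustration; see Quality metrics overview for the full metric list OpenScale computes):

```python
def quality_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Common quality metrics derived from a binary confusion matrix:
    true positives, false positives, false negatives, true negatives."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)          # how many flagged positives were correct
    recall = tp / (tp + fn)             # how many actual positives were found
    f1 = 2 * precision * recall / (precision + recall)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

# Hypothetical confusion matrix counts.
print(quality_metrics(tp=50, fp=10, fn=5, tn=35))
```

Precision and recall often trade off against each other, which is why the detailed results page shows the full matrix rather than a single number.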

Task 3: View the model monitors for explainability

It is important to understand how and why the model determined its predicted outcome. Explainability identifies the features that positively and negatively impact a predicted outcome.

  1. From the Watson OpenScale navigator, click the Explain a transaction icon.
  2. Select a model from the Deployed model list.
    The Recent transactions list displays all of the transactions that are processed by your model.
  3. Click Explain in the Actions column.
    The Transaction details page provides an analysis and a chart of the features that positively and negatively influenced the predicted outcome of the transaction.
  4. (Optional) For further analysis, click the Inspect tab.
    You can set new values to determine a different predicted outcome for the transaction. After you set new values, click Run analysis to show how different values can change the outcome of the transaction.
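The Inspect tab's what-if analysis can be illustrated with a toy stand-in model. The scoring rule, feature names, and values below are invented for illustration and are not related to the deployed German credit risk models:

```python
def toy_credit_model(features: dict) -> str:
    """Hypothetical stand-in for a deployed scoring model."""
    score = features["income"] / 1000 - features["debt"] / 500
    return "Approved" if score >= 10 else "Denied"

original = {"income": 12000, "debt": 2000}
print(toy_credit_model(original))      # -> Denied (score = 12 - 4 = 8)

# Change one feature value, as on the Inspect tab, and re-run the analysis.
adjusted = dict(original, debt=500)
print(toy_credit_model(adjusted))      # -> Approved (score = 12 - 1 = 11)
```

Changing a single influential feature flips the prediction, which is exactly the kind of sensitivity the Inspect tab lets you probe interactively.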

Next steps

APIs, SDKs, and tutorials for Watson OpenScale
