Use the ModelOps tools to manage your AI assets from development to production.
ModelOps explained
MLOps synchronizes cadences between the application and model pipelines. It builds on these practices:
DevOps for bringing a machine learning model from creation, through training and deployment, to production.
ModelOps for managing the lifecycle of a traditional machine learning model, including evaluation and retraining.
MLOps includes not just the routine deployment of machine learning models but also the continuous retraining, automated updating, and synchronized development and deployment of more complex machine learning models. Explore these resources for
more details on developing an MLOps strategy:
watsonx.ai use case describes how to develop and deploy machine learning models and generative AI solutions.
watsonx.governance use case describes how to monitor, maintain, automate, and govern machine learning and generative AI models in production.
ModelOps tools
Depending on the platform you are using and the services you have enabled, you can design your ModelOps process using a combination of tools to help you manage assets.
Pipelines for automating the end-to-end flow of a machine learning model through the AI lifecycle.
AI Governance for creating a centralized repository of factsheets that track the lifecycle of a model, including the request, building, deployment, and evaluation of AI assets.
The cpdctl command-line interface tool for managing and automating your machine learning assets that are hosted on Cloud Pak for Data as a Service. Use automatic configuration from IBM Cloud to easily connect with the cpdctl API commands.
Managing access with deployment spaces
Use deployment spaces to organize and manage access to assets as they move through the AI lifecycle. For example, you can manage access with deployment spaces in the following ways (see the sketch after this list):
Create a deployment space and assign it to Development as the deployment stage. If you are governing assets, deployments in this type of space display in the Develop stage of a use case. Assign access to the data scientists who create the assets and to the DevOps users who create deployments.
Create a deployment space and assign it to Testing as the deployment stage. If you are governing assets, deployments in this type of space display in the Validate stage of a use case. Assign access to the model validators who test the deployments.
Create a deployment space and assign it to Production as the deployment stage. If you are governing assets, deployments in this type of space display in the Operate stage of a use case. Limit access to this space
to ModelOps users who manage the assets that are deployed to a production environment.
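The following sketch shows one way to create such a space programmatically. It is a minimal sketch that assumes the ibm_watsonx_ai Python client and an IBM Cloud API key; the URL, space name, and Cloud Object Storage CRN are placeholders, and stage assignment and collaborator access are typically set from the space's Manage tab.

from ibm_watsonx_ai import APIClient, Credentials

# Connect to watsonx.ai Runtime; the URL and API key are placeholders.
client = APIClient(Credentials(url="https://us-south.ml.cloud.ibm.com", api_key="***"))

# Create a space intended for the Development stage of the lifecycle.
space_details = client.spaces.store({
    client.spaces.ConfigurationMetaNames.NAME: "dev-space",
    client.spaces.ConfigurationMetaNames.DESCRIPTION: "Assets under development",
    # On Cloud Pak for Data as a Service, a Cloud Object Storage instance backs the space.
    client.spaces.ConfigurationMetaNames.STORAGE: {"resource_crn": "<cos-instance-crn>"},
})
space_id = client.spaces.get_id(space_details)
print("Created space:", space_id)

# Assign the deployment stage (Development, Testing, or Production) and grant
# collaborator access from the space's Manage tab, or govern it in a use case.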
Automating ModelOps by using Pipelines
The IBM Orchestration Pipelines editor provides a graphical interface for orchestrating an end-to-end flow of assets from creation through deployment. Assemble and configure a pipeline to create, train, deploy, and update machine learning
models and Python scripts. Make your ModelOps process simpler and repeatable.
Tracking models with AI Factsheets
AI Factsheets provides capabilities to track data science models across the organization and store the details in a catalog. View at a glance which models are in production and which need development or validation. Use the governance features to establish processes that manage the communication flow from data scientists to ModelOps administrators.
Note: A model inventory includes only the models that you explicitly track by associating them with model use cases. This way, you control which models are tracked for the organization and exclude samples and other models that are not significant to it.
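The following minimal sketch shows one hedged way to start capturing facts from a notebook. It assumes the ibm_aigov_facts_client package; the API key, experiment name, and project ID are placeholders, and constructor parameters can differ by package version.

from ibm_aigov_facts_client import AIGovFactsClient

# Connect the factsheets client to a project (or a space) so that training runs
# in this session can be captured as facts for the model's factsheet.
facts_client = AIGovFactsClient(
    api_key="***",
    experiment_name="churn-experiment",
    container_type="project",   # or "space"
    container_id="<project-id>",
)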
Evaluating model deployments
Use Watson OpenScale to analyze your AI with trust and transparency and understand how your AI models are involved in decision making. Detect and mitigate bias and drift. Increase the quality and accuracy of your predictions. Explain transactions
and perform what-if analysis.
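As a hedged illustration, the following sketch connects to Watson OpenScale and lists what is already configured for evaluation. It assumes the ibm-watson-openscale Python SDK and an IBM Cloud API key; the key is a placeholder and the exact method set can vary by SDK version.

from ibm_cloud_sdk_core.authenticators import IAMAuthenticator
from ibm_watson_openscale import APIClient

# Authenticate with an IBM Cloud API key (placeholder).
wos_client = APIClient(authenticator=IAMAuthenticator(apikey="***"))

# List the data mart, the deployments that are subscribed for evaluation, and the
# configured monitors (for example, quality, fairness, and drift).
wos_client.data_marts.show()
wos_client.subscriptions.show()
wos_client.monitor_instances.show()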
Automating asset and lifecycle management
You can automate the AI lifecycle in a notebook by using the watsonx.ai Runtime Python client. For example, you can complete the following tasks (see the sketch after this list):
Download an externally trained scikit-learn model with data set
Persist an external model in the watsonx.ai Runtime repository
Deploy a model for online scoring by using the client library
Score sample records by using the client library
Update a previously persisted model
Redeploy a model in-place
Scale a deployment
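The following is a minimal sketch of that flow, assuming the ibm_watsonx_ai Python client, a scikit-learn model, and an existing deployment space. The space ID, model type, and software specification name are placeholders that depend on your account and installed framework versions.

from ibm_watsonx_ai import APIClient, Credentials
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Connect and work against an existing deployment space (placeholders).
client = APIClient(Credentials(url="https://us-south.ml.cloud.ibm.com", api_key="***"))
client.set.default_space("<space-id>")

# Train (or import) a model.
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=200).fit(X, y)

# Persist the model in the watsonx.ai Runtime repository; the model type and
# software specification must match your scikit-learn and runtime versions.
sw_spec_id = client.software_specifications.get_id_by_name("runtime-24.1-py3.11")
model_details = client.repository.store_model(
    model=model,
    meta_props={
        client.repository.ModelMetaNames.NAME: "iris-classifier",
        client.repository.ModelMetaNames.TYPE: "scikit-learn_1.3",
        client.repository.ModelMetaNames.SOFTWARE_SPEC_ID: sw_spec_id,
    },
)
model_id = client.repository.get_model_id(model_details)

# Deploy the model for online scoring.
deployment = client.deployments.create(
    model_id,
    meta_props={
        client.deployments.ConfigurationMetaNames.NAME: "iris-online",
        client.deployments.ConfigurationMetaNames.ONLINE: {},
    },
)
deployment_id = client.deployments.get_id(deployment)

# Score sample records by using the client library.
payload = {"input_data": [{"values": X[:2].tolist()}]}
print(client.deployments.score(deployment_id, payload))

Updating a persisted model, redeploying it in place, and scaling a deployment follow the same pattern through the client's repository and deployments interfaces.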
Alternatively, you can use the IBM Cloud Pak for Data command-line interface (cpdctl) to manage configuration settings and automate an end-to-end flow, including training a model, saving it, creating a deployment space, and deploying the model.
Typical ModelOps scenario
A typical ModelOps scenario in Cloud Pak for Data might be:
Organize and curate data assets
Train a model by using AutoAI
Save and deploy the model
Track the model in a use case so that all collaborators can follow its progress through the lifecycle and confirm that it complies with organizational standards
Evaluate the deployment for bias
Update the deployment with a better-performing model
Monitor deployments and jobs across the organization