Tracking assets in an AI use case

Track machine learning models or prompt templates in AI use cases to capture details about them in factsheets. Use the information collected in the AI use case to monitor the progress of assets through the AI lifecycle, from request to production.

Define an AI use case to identify a business problem and request a solution. A solution might be one or more models to address the business problem. When an asset is developed, associate it with the use case to capture details about the asset in factsheets. As the asset moves through the AI lifecycle, from development to testing and then to production, the factsheets collect the data to support governance or compliance goals.
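
For example, the association step can be scripted with the AI Factsheets Python client. The following is a minimal sketch that assumes the ibm_aigov_facts_client package; the method names (get_model, get_ai_usecase) and all IDs shown are illustrative placeholders and can differ by client version, so check the facts client reference for the exact signatures.

```python
# Minimal sketch (assumptions: ibm_aigov_facts_client is installed and the
# method names below match your client version; all IDs are placeholders).
from ibm_aigov_facts_client import AIGovFactsClient

facts_client = AIGovFactsClient(
    api_key="<IBM_CLOUD_API_KEY>",   # placeholder credential
    container_type="project",       # or "space"
    container_id="<PROJECT_ID>",    # placeholder project or space ID
    disable_tracing=True,           # factsheet calls only, no autologging
)

# Look up the trained model and the AI use case that requested it.
model = facts_client.assets.get_model(model_id="<MODEL_ID>")
use_case = facts_client.assets.get_ai_usecase(
    ai_usecase_id="<USECASE_ID>",
    catalog_id="<INVENTORY_CATALOG_ID>",
)
```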

Creating approaches to compare ways to solve a problem

Each AI use case contains at least one approach. An approach is one facet of the solution to the business problem that the AI use case represents. For example, you might create two approaches that build predictive models with different frameworks, then compare them to see which performs best.

Approaches also capture version information. The same version number is applied to all assets in an approach. If you have a stable version of an asset, you might maintain that version in an approach and create a new approach for the next round of iteration and experimentation.
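
Continuing the sketch above, associating the model with the use case under a specific approach and version might look like the following; the track method, its keyword arguments, and get_approaches are assumptions based on typical facts client usage and should be verified against the client reference.

```python
# Continuation of the earlier sketch; method names and keyword arguments are
# assumptions and may differ by client version.
# Approaches are typically created in the AI use case itself; here we pick one.
approach = use_case.get_approaches()[0]

# Track the model in the use case under that approach. The version bump
# ("minor" here) is applied to every asset tracked in the approach.
model.track(
    usecase=use_case,
    approach=approach,
    version_number="minor",
)
```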

Adding assets to a use case

You can track these assets in an AI use case:

  • Machine learning models that are created by using a watsonx.ai Runtime tool such as AutoAI or SPSS Modeler.
  • External models that are created in Jupyter notebooks or by using a third-party machine learning provider, as in the sketch after this list.

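External models that are not trained with watsonx.ai Runtime must be registered in the inventory before they can be tracked. The sketch below assumes the facts client exposes external_model_facts.save_external_model_asset; the parameter names are assumptions and vary by client version.

```python
# Sketch for registering an external model (method name and parameters are
# assumptions; verify against the ibm_aigov_facts_client reference).
external_model = facts_client.external_model_facts.save_external_model_asset(
    model_identifier="churn-xgboost-prod",  # placeholder ID from the external provider
    name="Customer churn (external XGBoost)",
    description="Model trained outside watsonx.ai Runtime",
)
```

Once registered, the external model asset can be associated with an AI use case and approach in the same way as a model created with a watsonx.ai Runtime tool.
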
Parent topic: Governing assets in AI use cases
