Model deployment

To deploy a model, create a model ready for deployment in your deployment space and then upload your model archive as a tar.gz or .zip file. Once deployed, you can submit jobs to your model and monitor job states.

First, package your Decision Optimization model, optionally with common data, as a tar.gz or .zip file ready for deployment.

This archive includes the following files:
  1. Your model files
  2. Settings (optional; see Solve parameters for more information)
  3. Common data (optional)
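
For example, here is a minimal packaging sketch using Python's standard tarfile module; the file names (main.py, solve_parameters.json, common_data.csv) are placeholders for your own model, settings, and common data files.

    import tarfile

    # Placeholder file names; replace with your own model, settings, and common data files.
    files_to_package = ["main.py", "solve_parameters.json", "common_data.csv"]

    # Create model.tar.gz, ready to upload to Watson Machine Learning.
    with tarfile.open("model.tar.gz", "w:gz") as archive:
        for name in files_to_package:
            archive.add(name)
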
To create a model ready for deployment, you register it in Watson Machine Learning providing the following information:
  • the Decision Optimization runtime version:
    • do_12.10 runtime is based on CPLEX V.12.10
    • do_12.9 runtime is based on CPLEX V.12.9
  • the model type:
    • opl (do-opl_<runtime version>)
    • cplex (do-cplex_<runtime version>)
    • cpo (do-cpo_<runtime version>)
    • docplex (do-docplex_<runtime version>) using Python V.3.6

    (The runtime version can be any of the available runtimes; for example, an opl model with runtime 12.10 has the model type do-opl_12.10.)

  • the hardware specification, chosen from the available configuration sizes (small S, medium M, extra large XL)

and upload the associated model archive if needed.

This Watson Machine Learning model can then be used in one or multiple deployments.
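
The following sketch illustrates this registration step, assuming the ibm_watson_machine_learning Python client (APIClient) and a docplex model on the do_12.10 runtime; the credentials, deployment space ID, and model name are placeholders.

    from ibm_watson_machine_learning import APIClient

    # Placeholder credentials and deployment space ID; replace with your own.
    wml_credentials = {
        "apikey": "<your-api-key>",
        "url": "https://us-south.ml.cloud.ibm.com",
    }
    client = APIClient(wml_credentials)
    client.set.default_space("<your-deployment-space-id>")

    # Register the model: the model type do-docplex_12.10 pairs with the do_12.10 runtime.
    software_spec_uid = client.software_specifications.get_uid_by_name("do_12.10")
    model_metadata = {
        client.repository.ModelMetaNames.NAME: "My Decision Optimization model",
        client.repository.ModelMetaNames.TYPE: "do-docplex_12.10",
        client.repository.ModelMetaNames.SOFTWARE_SPEC_UID: software_spec_uid,
    }

    # Upload the model archive and keep its ID for later deployments.
    model_details = client.repository.store_model(
        model="model.tar.gz", meta_props=model_metadata
    )
    model_uid = client.repository.get_model_uid(model_details)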

In summary, to deploy your model:

  1. Choose your desired Decision Optimization runtime.
  2. Package your Decision Optimization model with common data (optional) ready for deployment as a tar.gz or .zip file.
  3. Upload your model archive (tar.gz or .zip file) to Watson Machine Learning. See Model execution for information about input file types. You obtain a model-URL.
  4. Deploy your model using the model-URL and obtain a deployment-id.
  5. Monitor the deployment using the deployment-id. Deployment states can be: initializing, updating, ready, or failed.
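
Continuing the registration sketch above, the following illustrates steps 4 and 5 plus a job submission, again assuming the ibm_watson_machine_learning Python client; the deployment name, hardware size, and input/output data values are illustrative placeholders.

    # Deploy the stored model as a batch deployment with a small (S) configuration.
    deployment_metadata = {
        client.deployments.ConfigurationMetaNames.NAME: "My Decision Optimization deployment",
        client.deployments.ConfigurationMetaNames.BATCH: {},
        client.deployments.ConfigurationMetaNames.HARDWARE_SPEC: {"name": "S", "num_nodes": 1},
    }
    deployment_details = client.deployments.create(model_uid, meta_props=deployment_metadata)
    deployment_uid = client.deployments.get_uid(deployment_details)

    # Monitor the deployment: the state can be initializing, updating, ready, or failed.
    state = client.deployments.get_details(deployment_uid)["entity"]["status"]["state"]
    print("Deployment state:", state)

    # Once the deployment is ready, submit a job with inline input data and
    # collect all CSV outputs.
    job_payload = {
        client.deployments.DecisionOptimizationMetaNames.INPUT_DATA: [
            {"id": "input.csv", "fields": ["id", "value"], "values": [[1, 10]]}
        ],
        client.deployments.DecisionOptimizationMetaNames.OUTPUT_DATA: [
            {"id": ".*\\.csv"}
        ],
    }
    job_details = client.deployments.create_job(deployment_uid, meta_props=job_payload)
    job_uid = client.deployments.get_job_uid(job_details)

    # Monitor the job state until the job completes.
    job_state = client.deployments.get_job_details(job_uid)["entity"]["decision_optimization"]["status"]["state"]
    print("Job state:", job_state)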