Last updated: Nov 27, 2024
You can define your own transformers, estimators, functions, classes, and tensor operations in models that you deploy in IBM watsonx.ai Runtime as online deployments.
Defining and using custom components
To use custom components with your models, you need to package your custom components in a Python distribution package.
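For example, a custom component might be a scikit-learn transformer class defined inside your package. A minimal sketch, assuming scikit-learn is installed (the class name and the log-scaling behavior are illustrative, not from this page):

```python
import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin

class LogScaler(BaseEstimator, TransformerMixin):
    """Hypothetical custom transformer: applies log1p scaling."""

    def fit(self, X, y=None):
        # Stateless transformer; nothing to learn from the data.
        return self

    def transform(self, X):
        return np.log1p(np.asarray(X, dtype=float))
```

Because LogScaler is not part of scikit-learn itself, any pipeline that uses it can be deserialized in the deployment runtime only if this class is shipped in your custom package.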
Package requirements
- The package type must be: source distribution (distributions of type Wheel and Egg are not supported)
- The package file format must be: .zip
- Any third-party dependencies for your custom components must be installable by pip and must be passed to the install_requires argument of the setup function of the setuptools library.

Refer to: Creating a source distribution
Storing your custom package
You must take extra steps when you store your trained model in the watsonx.ai Runtime repository:
- Store your custom package in the watsonx.ai Runtime repository (use the runtimes.store_library function from the watsonx.ai Python client, or the store libraries watsonx.ai Runtime CLI command).
- Create a runtime resource object that references your stored custom package, and then store the runtime resource object in the watsonx.ai Runtime repository (use the runtimes.store function, or the store runtimes command).
- When you store your trained model in the watsonx.ai Runtime repository, reference your stored runtime resource in the metadata that is passed to the store_model function (or the store command).
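The storage steps above can be sketched with the watsonx.ai Python client. The function names (runtimes.store_library, runtimes.store, store_model) come from this page, but the client object, metadata field names, and how the runtime metadata references the stored library are assumptions; check the Python client reference for the exact properties:

```python
def store_custom_model(client, model, library_meta, runtime_meta, model_meta):
    """Sketch of the three storage steps; metadata keys are assumptions."""
    # 1. Store the custom package (the .zip source distribution).
    library = client.runtimes.store_library(library_meta)

    # 2. Store a runtime resource that references the stored package
    #    (how runtime_meta points at `library` is client-specific).
    runtime = client.runtimes.store(runtime_meta)

    # 3. Store the model, referencing the runtime resource in the
    #    metadata passed to store_model.
    return client.repository.store_model(model, meta_props=model_meta)
```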
Supported frameworks
These frameworks support custom components:
- Scikit-learn
- XGBoost
- TensorFlow
- Python Functions
- Python Scripts
- Decision Optimization
For more information, see Supported frameworks.
Parent topic: Customizing deployment runtimes