Last updated: Nov 27, 2024
You can define your own transformers, estimators, functions, classes, and tensor operations in models that you deploy in IBM watsonx.ai Runtime as online deployments.
Defining and using custom components
To use custom components with your models, you need to package your custom components in a Python distribution package.
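For example, a custom scikit-learn transformer that you plan to use in a deployed pipeline would live in a module inside that distribution package. The module and class names in this sketch (my_custom_transformers/transformers.py, LogScaler) are hypothetical placeholders.

```python
# my_custom_transformers/transformers.py -- hypothetical module in your package.
# A minimal custom transformer; any class that your pickled model references
# must be importable from the package at deployment time.
import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin


class LogScaler(BaseEstimator, TransformerMixin):
    """Applies log(1 + x) to every feature (illustrative only)."""

    def fit(self, X, y=None):
        return self

    def transform(self, X):
        return np.log1p(X)
```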
Package requirements
- The package type must be: source distribution (distributions of type Wheel and Egg are not supported)
- The package file format must be: .zip
- Any third-party dependencies for your custom components must be installable by pip, and must be passed to the install_requires argument of the setup function of the setuptools library (see the packaging sketch below)
Refer to: Creating a source distribution
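For example, a minimal setup.py for such a package might look like the following sketch. The package name my_custom_transformers and the pinned dependencies are placeholders; list whatever third-party libraries your components actually import.

```python
# setup.py -- minimal sketch of a source distribution for custom components.
# The package name, version, and dependencies are placeholders.
from setuptools import setup, find_packages

setup(
    name="my_custom_transformers",
    version="0.1.0",
    packages=find_packages(),      # picks up the my_custom_transformers/ module
    install_requires=[
        "numpy>=1.23",             # third-party dependencies; installed by pip
        "scipy>=1.10",             # when the deployment runtime is provisioned
    ],
)
```

You can then build the .zip source distribution with python setup.py sdist --formats=zip, which writes the archive to the dist/ directory.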
Storing your custom package
You must take extra steps when you store your trained model in the watsonx.ai Runtime repository:
- Store your custom package in the watsonx.ai Runtime repository (use the runtimes.store_library function from the watsonx.ai Python client, or the store libraries watsonx.ai Runtime CLI command.)
- Create a runtime resource object that references your stored custom package, and then store the runtime resource object in the watsonx.ai Runtime repository (use the runtimes.store function, or the store runtimes command.)
- When you store your trained model in the watsonx.ai Runtime repository, reference your stored runtime resource in the metadata that is passed to the store_model function (or the store command.) A combined sketch of these steps follows this list.
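The following sketch strings the three steps together with the Python client. The runtimes.store_library, runtimes.store, and store_model calls come from the steps above; the client import, credential fields, meta-property names (NAME, FILEPATH, LIBRARIES_UIDS, RUNTIME_UID, and so on), and the get_library_uid / get_uid helpers are assumptions based on the legacy client interface that exposes the runtimes APIs, and they may differ in your client version.

```python
# Sketch only: the client package, meta-property names, and helper methods are
# assumptions; check the Python client reference for your release.
from ibm_watson_machine_learning import APIClient

client = APIClient({"url": "https://<your-endpoint>", "apikey": "<your-api-key>"})

# 1. Store the custom package (the .zip source distribution) as a library.
lib_meta = {
    client.runtimes.LibraryMetaNames.NAME: "my_custom_transformers",
    client.runtimes.LibraryMetaNames.VERSION: "0.1.0",
    client.runtimes.LibraryMetaNames.PLATFORM: {"name": "python", "versions": ["3.10"]},
    client.runtimes.LibraryMetaNames.FILEPATH: "dist/my_custom_transformers-0.1.0.zip",
}
lib_uid = client.runtimes.get_library_uid(client.runtimes.store_library(lib_meta))

# 2. Store a runtime resource object that references the library.
runtime_meta = {
    client.runtimes.ConfigurationMetaNames.NAME: "runtime-with-custom-components",
    client.runtimes.ConfigurationMetaNames.PLATFORM: {"name": "python", "version": "3.10"},
    client.runtimes.ConfigurationMetaNames.LIBRARIES_UIDS: [lib_uid],
}
runtime_uid = client.runtimes.get_uid(client.runtimes.store(runtime_meta))

# 3. Reference the runtime resource in the metadata passed to store_model.
model_meta = {
    client.repository.ModelMetaNames.NAME: "model-with-custom-components",
    client.repository.ModelMetaNames.RUNTIME_UID: runtime_uid,
}
# trained_pipeline is your trained model object (for example, a scikit-learn
# pipeline that uses the custom transformer shipped in the package).
model_details = client.repository.store_model(model=trained_pipeline, meta_props=model_meta)
```

At deployment time, the runtime installs the stored package and its install_requires dependencies with pip, so that the custom classes your serialized model references can be imported.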
Supported frameworks
These frameworks support custom components:
- Scikit-learn
- XGBoost
- TensorFlow
- Python Functions
- Python Scripts
- Decision Optimization
For more information, see Supported frameworks.
Parent topic: Customizing deployment runtimes