Using custom components in a TensorFlow model

This topic demonstrates how to use custom operators, defined with tf.py_func, in a TensorFlow model that you deploy in IBM Watson Machine Learning as an online deployment.

 

Steps

  1. Package components
  2. Store your custom package and model

After the trained model is stored with the required custom package and runtime details, you can deploy models that use custom components as online deployments the same way you do for models that don’t use custom components.

 

Step 1: Package components

To use custom components in your model, you need to package your custom components in a Python distribution package.

See: Requirements for using custom components in your models

TensorFlow-specific package requirement: initialize_py_func()

For TensorFlow models, your custom package must contain a function named initialize_py_func():

  • initialize_py_func() defines the tf.py_func operations.
  • initialize_py_func() must be referenceable using the top-level module name and the dot operator. For example, if the top-level module in the custom package is named my_top_module, then initialize_py_func() must be referenceable as my_top_module.initialize_py_func().
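As an illustration, the function that tf.py_func wraps is an ordinary Python/NumPy function. The following minimal sketch shows what such a module might contain; custom_reshape, the MNIST 28x28 input shape, and the module layout are assumptions modeled on the sample package below, and the tf.py_func call itself is shown only in a comment so the sketch runs without TensorFlow installed:

```python
import numpy as np

def custom_reshape(x):
    # The plain Python/NumPy function that tf.py_func would wrap as a
    # graph operation: flatten each 28x28 image in a batch to length 784.
    x = np.asarray(x, dtype=np.float32)
    return x.reshape(x.shape[0], -1)

def initialize_py_func():
    # In the real package, this function would build the tf.py_func op,
    # e.g. (TensorFlow 1.x):
    #   y = tf.py_func(custom_reshape, [input_tensor], tf.float32)
    # Here we only return the wrapped function to keep the sketch runnable.
    return custom_reshape
```

If this module were the top-level module my_top_module of the package, the deployment runtime would resolve the function as my_top_module.initialize_py_func().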

Sample

A sample custom package for a TensorFlow model can be found here: custom_reshape_pyfunc-0.1.zip.

See also: tf.py_func
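For reference, the setup.py inside such a distribution package might look like the following sketch. The package name is an assumption modeled on the sample zip; whatever name you pass here is the value you must later supply as the library NAME metadata (Python client) or the name key in library.json (CLI):

```python
# Hypothetical setup.py for a custom package such as custom_reshape_pyfunc.
# The "name" argument must match the NAME you use when storing the library
# in the Watson Machine Learning repository.
from setuptools import setup, find_packages

setup(
    name="custom_reshape_pyfunc",  # assumed name, matching the sample zip
    version="0.1",
    packages=find_packages(),
)
```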

 

Step 2: Store your custom package and model

After the model is trained, you must take extra steps when storing a model that uses custom components in the Watson Machine Learning repository:

  1. Store your custom package
  2. Create and store a runtime resource object
  3. Reference your stored runtime resource object when storing the model

See: Storing your custom package

 

Example 1: Using the Watson Machine Learning Python client

This example demonstrates using the Watson Machine Learning Python client to store a model that uses custom operators.

  1. Store your custom package in the Watson Machine Learning repository:

    pkg_meta = {
        client.runtimes.LibraryMetaNames.NAME        : "thepyfuncpackage",
        client.runtimes.LibraryMetaNames.DESCRIPTION : "A custom pyfunc lib which reshapes input",
        client.runtimes.LibraryMetaNames.FILEPATH    : "thepyfuncpackage-0.1.zip",
        client.runtimes.LibraryMetaNames.VERSION     : "1.0",
        client.runtimes.LibraryMetaNames.PLATFORM    : { "name": "python", "versions": ["3.6"] }
    }
    custom_package_details = client.runtimes.store_library( pkg_meta )
    custom_package_uid = client.runtimes.get_library_uid( custom_package_details )
    

    Notes:

    • For client.runtimes.LibraryMetaNames.NAME, specify the value passed in the name parameter of the setup() function in the setup.py file.
    • For client.runtimes.LibraryMetaNames.FILEPATH, specify the .zip file name of your custom package.
    • You need the identifier of the stored package, custom_package_uid, for the next step. For the sake of example, assume the stored package identifier is d546dbfa-d85f-419c-b4d2-750831a22b4f.
  2. Create a runtime resource object that references your stored custom package, and then store the runtime resource object in the Watson Machine Learning repository:

    runtime_meta = {
        client.runtimes.ConfigurationMetaNames.NAME        : "runtime_mnist",
        client.runtimes.ConfigurationMetaNames.DESCRIPTION : "runtime spec - mnist",
        client.runtimes.ConfigurationMetaNames.PLATFORM : {
            "name"    : "python",
            "version" : "3.6"
        },
        client.runtimes.ConfigurationMetaNames.LIBRARIES_UIDS: ["d546dbfa-d85f-419c-b4d2-750831a22b4f"]
    }
    runtime_details    = client.runtimes.store( runtime_meta )
    custom_runtime_uid = client.runtimes.get_uid( runtime_details )
    

    Notes:

    • For the sake of example, assume d546dbfa-d85f-419c-b4d2-750831a22b4f is the stored package identifier custom_package_uid from the previous step.
    • You need the identifier of the runtime resource object, custom_runtime_uid, for the next step. For the sake of example, assume the stored runtime resource identifier is a8cc02db-4a9b-4628-bf79-da5372e3f63b.
  3. Store your trained model in the Watson Machine Learning repository, referencing your stored runtime resource in meta data:

    model_meta = {
        client.repository.ModelMetaNames.AUTHOR_NAME       : "IBM",
        client.repository.ModelMetaNames.AUTHOR_EMAIL      : "ibm@ibm.com",
        client.repository.ModelMetaNames.NAME              : "cust_pyfunc_mnist",
        client.repository.ModelMetaNames.DESCRIPTION       : "cust MNIST with pyfunc",
        client.repository.ModelMetaNames.RUNTIME_UID       : "a8cc02db-4a9b-4628-bf79-da5372e3f63b",
        client.repository.ModelMetaNames.FRAMEWORK_NAME    : "tensorflow",
        client.repository.ModelMetaNames.FRAMEWORK_VERSION : "1.13",
        client.repository.ModelMetaNames.RUNTIME_NAME      : "python",
        client.repository.ModelMetaNames.RUNTIME_VERSION   : "3.6"
    }
    model_details = client.repository.store_model( model=model_path, meta_props=model_meta )
    

    Notes:

    • For the sake of example, assume a8cc02db-4a9b-4628-bf79-da5372e3f63b is the stored runtime resource identifier custom_runtime_uid from the previous step.
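After the model is stored, you deploy and score it like any other online deployment. As a minimal illustration, an online scoring payload is a JSON object whose values field holds one row per input instance; the flattened 784-element row shown here is an assumption matching a reshape-style custom op for 28x28 MNIST images:

```python
import json

# Hypothetical online-scoring payload for the stored MNIST model.
# "values" holds one row per input instance; a flattened 28x28 image
# (784 floats) is an assumption about this model's expected input shape.
payload = {"values": [[0.0] * 784]}

# The payload is sent as JSON to the deployment's scoring endpoint.
body = json.dumps(payload)
```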

 

Example 2: Using the Watson Machine Learning CLI

This example demonstrates using the Watson Machine Learning command line interface (CLI) to store a model that uses custom operators.

  1. Store your custom package in the Watson Machine Learning repository:

    bx ml store libraries thepyfuncpackage-0.1.zip library.json
    

    Sample library.json:

    {
        "name" : "thepyfuncpackage",
        "description" : "A custom pyfunc lib which reshapes input",
        "platform" : {
            "name"     : "python",
            "versions" : ["3.6"]
        },
        "version" : "1.0"
    }
    

    Sample output:

    Created Library with ID 'd546dbfa-d85f-419c-b4d2-750831a22b4f'
    

    Notes:

    • For the name key in library.json, specify the value passed in the name parameter of the setup() function in the setup.py file.
    • You need the identifier of the stored package, d546dbfa-d85f-419c-b4d2-750831a22b4f, for the next step.
  2. Create a runtime resource object that references your stored custom package, and then store the runtime resource object in the Watson Machine Learning repository:

    bx ml store runtimes runtimes.json
    

    Sample runtimes.json:

    {
        "name" : "runtime_mnist",
        "description" : "runtime spec - mnist",
        "platform" : {
            "name"    : "python",
            "version" : "3.6"
        },  
        "custom_libraries": {
            "urls": [ "https://us-south.ml.cloud.ibm.com/v4/libraries/d546dbfa-d85f-419c-b4d2-750831a22b4f" ]
        }
    }
    

    Sample output:

    Runtime created with ID 'a8cc02db-4a9b-4628-bf79-da5372e3f63b'
    

    Notes:

    • For the sake of example, assume d546dbfa-d85f-419c-b4d2-750831a22b4f is the stored package identifier in the output from the previous step. For the urls value in runtimes.json, look up the url for the custom package using the command bx ml show libraries d546dbfa-d85f-419c-b4d2-750831a22b4f.
    • You need the identifier of the runtime resource object, a8cc02db-4a9b-4628-bf79-da5372e3f63b, for the next step.
  3. Store your trained model in the Watson Machine Learning repository, referencing your stored runtime resource in meta data:

    bx ml store tf_mnist_pyfunc.zip models.json
    

    Sample models.json:

    { 
        "framework" : {
            "name"     : "tensorflow",
            "version"  : "1.13",
            "runtimes" : [
                {   
                    "name"    : "python",
                    "version" : "3.6"
                }   
            ]   
        },  
        "name"        : "cust_pyfunc_mnist",
        "description" : "cust MNIST with pyfunc",
        "runtime_url" : runtimes_url
    }
    

    Sample output:

    Model store successful. Model-ID is '9670732a-2039-45c5-ba13-c7352c0c4965'
    

    Notes:

    • For the sake of example, assume a8cc02db-4a9b-4628-bf79-da5372e3f63b is the stored runtime resource identifier in the output from the previous step. Replace the runtimes_url placeholder in models.json with the url of the runtime resource, which you can look up using the command bx ml show runtimes a8cc02db-4a9b-4628-bf79-da5372e3f63b.
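The library url used in runtimes.json above follows a predictable pattern: the v4 service endpoint plus /libraries/ plus the library ID. A small sketch of that relationship, using the us-south endpoint taken from the example above (treat bx ml show as the authoritative way to obtain the real url):

```python
# Illustration only: the v4 library url embeds the library ID as its last
# path segment. Use `bx ml show libraries <ID>` to confirm the actual url.
BASE = "https://us-south.ml.cloud.ibm.com/v4"

def library_url(library_id):
    return f"{BASE}/libraries/{library_id}"

print(library_url("d546dbfa-d85f-419c-b4d2-750831a22b4f"))
```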

 

See also

For a full example, see this sample notebook: CustLib-TF-PyFunc-Model-Save-Deploy-Score.ipynb