# Importing trained Caffe model into Watson Machine Learning

If you have a Caffe model that you trained outside of IBM Watson Machine Learning, this topic describes how to import that model into your Watson Machine Learning service.

 

## Restrictions and requirements

- The saved (serialized) model file must be at the top level of the `.tar.gz` file that is uploaded to the Watson Machine Learning repository using the `client.repository.store_model()` API.
- The only supported deployment types for Caffe models are: web service and batch.
- See also: Supported frameworks

 

## Example

The following notebook demonstrates importing a Caffe model:

[NEED A NOTEBOOK FOR CAFFE]

 

## Interface options

You can import the model by using either the Watson Machine Learning Python client or the Watson Machine Learning CLI.

 

### Step 0 for interface options 1 and 2: Build, train, and save a model

[NEED INFO ON FILE TYPE, ETC] The following Python code snippet demonstrates:

- Building and training a model
- Saving the trained model in a serialized model file (for Caffe, typically a `.caffemodel` file)
- Saving the serialized model file at the top level of a `.tgz` file called "message-classification-model.tgz"

[NEED CODE]
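Until the full code sample is added, the packaging step can be sketched with Python's standard `tarfile` module. This only illustrates the archive layout that `store_model()` expects; the file names are placeholders, and the empty file stands in for a real serialized Caffe model:

```python
import os
import tarfile

model_file = "message-classification-model.caffemodel"  # placeholder name for the serialized model
archive = "message-classification-model.tgz"

# Placeholder file so this sketch is self-contained; in practice this is
# the file written when you save your trained Caffe model.
open(model_file, "wb").close()

# The serialized model must sit at the top level of the archive, so add it
# with a bare arcname (no directory prefix).
with tarfile.open(archive, "w:gz") as tar:
    tar.add(model_file, arcname=os.path.basename(model_file))

with tarfile.open(archive, "r:gz") as tar:
    print(tar.getnames())  # ['message-classification-model.caffemodel']
```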

{:python}
Where:

[NEED NOTEBOOK]

For the full code example, see:

<p>&nbsp;</p>


## Interface option 1: Watson Machine Learning Python client
{: #python}

### Step 1: Store the model in your Watson Machine Learning repository
You can store the model in your Watson Machine Learning repository using the Watson Machine Learning Python client <a href="https://wml-api-pyclient.mybluemix.net/index.html?highlight=store_model#client.Repository.store_model" target="_other"><code>store_model</code> method <img src="../console/images/launch-glyph.png" alt="external link" /></a>.

[ANY SPECIAL NOTES?]



**Example Python code**

```
from watson_machine_learning_client import WatsonMachineLearningAPIClient
client = WatsonMachineLearningAPIClient(<your-credentials>)
metadata = {
   client.repository.ModelMetaNames.NAME: "caffe model",
   client.repository.ModelMetaNames.FRAMEWORK_NAME: "caffe",
   client.repository.ModelMetaNames.FRAMEWORK_VERSION: "1.0",
   client.repository.ModelMetaNames.FRAMEWORK_LIBRARIES: [{'name': 'caffe', 'version': '1.0'}]
}
model_details = client.repository.store_model(
   model="message-classification-model.tgz",
   meta_props=metadata
)
```
{:python}
Where:
- &lt;your-credentials> contains credentials for your Watson Machine Learning service (see: <a href="ml-get-wml-credentials.html">Looking up credentials</a>)
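For reference, the credentials are passed to the client constructor as a Python dictionary. The exact keys depend on your service instance (see the credentials lookup topic linked above); the keys and values below are placeholders only:

```python
# Placeholder values -- copy the real values from your
# Watson Machine Learning service credentials.
wml_credentials = {
    "url": "https://us-south.ml.cloud.ibm.com",
    "username": "***",
    "password": "***",
    "instance_id": "***"
}

# The dictionary is then passed to the client constructor:
# client = WatsonMachineLearningAPIClient(wml_credentials)
```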

### Step 2: Deploy the stored model in your Watson Machine Learning service
The following example demonstrates deploying the stored model as a web service, which is the default deployment type:

```
model_id = model_details["metadata"]["guid"]
model_deployment_details = client.deployments.create(
   artifact_uid=model_id,
   name="My Caffe model deployment"
)
```
{:python}

See:  <a href="https://wml-api-pyclient.mybluemix.net/index.html?highlight=deployment%20create#client.Deployments.create" target="_other"><code>Deployments.create</code> <img src="../console/images/launch-glyph.png" alt="external link" /></a>

<p>&nbsp;</p>


## Interface option 2: Watson Machine Learning CLI
{: #cli}

**Prerequisite**: <a href="ml_dlaas_environment.html">Set up the CLI environment</a>.

### Step 1: Store the model in your Watson Machine Learning repository

**Example command and corresponding output**

```
ibmcloud ml store <model-filename> <manifest-filename>
Starting to store ...
OK
Model store successful. Model-ID is '145bca56-134f-7e89-3c12-0d3a7859d21f'.
```

Where:

- `<model-filename>` is the path and name of the .tgz file
- `<manifest-filename>` is the path and name of a manifest file containing metadata about the model being stored. Note that metadata that specifies Caffe and its version must be provided under the framework property in the manifest file used to store the model.

**Sample manifest file contents**

```
name: My Caffe model
framework:
  name: caffe
  version: '1.0'
  libraries: [{"name": "caffe", "version": "1.0"}]
```

See: the `store` CLI command reference

### Step 2: Deploy the stored model in your Watson Machine Learning service

The following example demonstrates deploying the stored model as a web service, which is the default deployment type:

**Example command and corresponding output**

```
>ibmcloud ml deploy <model-id> "My Caffe model deployment"
Deploying the model with MODEL-ID '145bca56-134f-7e89-3c12-0d3a7859d21f'...
DeploymentId       316a89e2-1234-6472-1390-c5432d16bf73
Scoring endpoint   https://us-south.ml.cloud.ibm.com/v3/wml_instances/5da31...
Name               My Caffe model deployment
Type               caffe-1.0
Runtime            None Provided
Status             DEPLOY_SUCCESS
Created at         2019-01-14T19:47:51.735Z
OK
Deploy model successful
```

Where:

- `<model-id>` was returned in the output from the store command

See: the `deploy` CLI command reference