
Use watsonx.ai Python SDK to manage Prompt Template assets and create deployment¶

Disclaimers¶

  • Use only Projects and Spaces that are available in watsonx context.

Notebook content¶

This notebook contains the steps and code to demonstrate support for Prompt Template inference and deployment.

Some familiarity with Python is helpful. This notebook uses Python 3.11.

Learning goal¶

The goal of this notebook is to demonstrate how to create a Prompt Template asset and a deployment pointing to it. In general, a Prompt Template is a pattern for generating prompts for language models. A template may contain an instruction, input/output prefixes, few-shot examples, and appropriate context that may vary across tasks.
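Before turning to the SDK, the general idea can be illustrated with plain Python: a prompt template is essentially a pattern string with named placeholders that are filled in at inference time. This is only a conceptual sketch and is independent of the watsonx.ai SDK:

```python
# Minimal, SDK-independent sketch of the prompt-template idea:
# a pattern string with a named placeholder filled in per request.
template = (
    "Answer the following question.\n\n"
    "Human: What is {object} and how does it work?\n"
    "Assistant:"
)

# Substitute the input variable to obtain the concrete prompt text.
prompt = template.format(object="a loan")
print(prompt)
```

The `PromptTemplate` class used later in this notebook adds structure on top of this idea (instruction, prefixes, few-shot examples) and validates that every placeholder is declared as an input variable.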

Contents¶

This notebook contains the following parts:

  • Setup
  • Prompt Template Inference
  • Prompt Template Deployment
  • Summary

Set up the environment¶

Before you use the sample code in this notebook, you must perform the following setup tasks:

  • Create a watsonx.ai Runtime Service instance (a free plan is offered and information about how to create the instance can be found here).

Install dependencies¶

Note: ibm-watsonx-ai documentation can be found here.

In [1]:
%pip install -U ibm-watsonx-ai | tail -n 1
%pip install -U "langchain<1.0" | tail -n 1
Successfully installed anyio-4.11.0 cachetools-6.2.2 certifi-2025.11.12 charset_normalizer-3.4.4 h11-0.16.0 httpcore-1.0.9 httpx-0.28.1 ibm-cos-sdk-2.14.3 ibm-cos-sdk-core-2.14.3 ibm-cos-sdk-s3transfer-2.14.3 ibm-watsonx-ai-1.4.6 idna-3.11 jmespath-1.0.1 lomond-0.3.3 numpy-2.3.4 pandas-2.2.3 pytz-2025.2 requests-2.32.5 sniffio-1.3.1 tabulate-0.9.0 tzdata-2025.2 urllib3-2.5.0
Successfully installed PyYAML-6.0.3 SQLAlchemy-2.0.44 annotated-types-0.7.0 jsonpatch-1.33 jsonpointer-3.0.0 langchain-0.3.27 langchain-core-0.3.79 langchain-text-splitters-0.3.11 langsmith-0.4.42 orjson-3.11.4 pydantic-2.12.4 pydantic-core-2.41.5 requests-toolbelt-1.0.0 tenacity-9.1.2 typing-inspection-0.4.2 zstandard-0.25.0

Defining the watsonx credentials¶

This cell defines the watsonx credentials required to work with watsonx Prompt Template inference.

Action: Provide the IBM Cloud user API key. For details, see the documentation.

In [2]:
import getpass

from ibm_watsonx_ai import Credentials

credentials = Credentials(
    url="https://us-south.ml.cloud.ibm.com",
    api_key=getpass.getpass("Please enter your watsonx.ai api key (hit enter): "),
)

Defining the project ID¶

The Prompt Template requires a project ID that provides the context for the call. If this notebook runs inside a project, the ID is obtained from the environment; otherwise, please provide the project ID.

In [3]:
import os

try:
    project_id = os.environ["PROJECT_ID"]
except KeyError:
    project_id = input("Please enter your project_id (hit enter): ")

API Client initialization¶

In [4]:
from ibm_watsonx_ai import APIClient

client = APIClient(credentials, project_id=project_id)

Prompt Template on watsonx.ai¶

In [5]:
from ibm_watsonx_ai.foundation_models.prompts import (
    PromptTemplate,
    PromptTemplateManager,
)
from ibm_watsonx_ai.foundation_models.utils.enums import (
    DecodingMethods,
    PromptTemplateFormats,
)
from ibm_watsonx_ai.metanames import GenTextParamsMetaNames as GenParams

Instantiate PromptTemplateManager¶

In [6]:
prompt_mgr = PromptTemplateManager(api_client=client)

Create a Prompt Template object and store it in the project¶

We use a PromptTemplate object to store the properties of a newly created prompt template. The prompt text is composed of the instruction, input/output prefixes, few-shot examples, and input text. All of the mentioned fields may contain placeholders ({...}) with input variables; for the template to be valid, these input variables must also be specified in the input_variables parameter.
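Conceptually, those fields are later assembled into a single prompt string: the instruction first, then each few-shot example wrapped in the input/output prefixes, then the input text. A rough sketch of that assembly in plain Python (the real formatting is performed by the SDK's load_prompt, shown further below) could look like this:

```python
# Hypothetical re-creation of how the template fields combine into prompt text.
instruction = "Answer on the following question"
input_prefix, output_prefix = "Human", "Assistant"
examples = [
    [
        "What is a loan and how does it work?",
        "A loan is a debt that is repaid with interest over time.",
    ]
]
input_text = "What is {object} and how does it work?"

# Assemble: instruction, blank line, prefixed few-shot examples, input text.
parts = [instruction, ""]
for example_input, example_output in examples:
    parts += [f"{input_prefix} {example_input}",
              f"{output_prefix} {example_output}", ""]
parts += [f"{input_prefix} {input_text}", output_prefix]

prompt_text = "\n".join(parts)
print(prompt_text)
```

The `{object}` placeholder remains in the assembled text; it is substituted only at inference time via prompt variables.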

In [7]:
prompt_template = PromptTemplate(
    name="New prompt",
    model_id=client.foundation_models.TextModels.GRANITE_4_H_SMALL,
    model_params={GenParams.DECODING_METHOD: "sample"},
    description="My example",
    task_ids=["generation"],
    input_variables=["object"],
    instruction="Answer on the following question",
    input_prefix="Human",
    output_prefix="Assistant",
    input_text="What is {object} and how does it work?",
    examples=[
        [
            "What is a loan and how does it work?",
            "A loan is a debt that is repaid with interest over time.",
        ]
    ],
)

Using the store_prompt method, you can store a newly created prompt template in your project.

In [8]:
stored_prompt_template = prompt_mgr.store_prompt(prompt_template=prompt_template)
In [9]:
print(f"Asset id: {stored_prompt_template.prompt_id}")
print(f"Is it a template?: {stored_prompt_template.is_template}")
Asset id: 81fb51ea-4042-4a81-812c-93cf1e41f72c
Is it a template?: True

We can also store a template which is a LangChain PromptTemplate object.

In [10]:
from langchain_core.prompts import PromptTemplate as LcPromptTemplate

langchain_prompt_template = LcPromptTemplate(
    template="What is {object} and how does it work?",
    input_variables=["object"],
    metadata={"name": "LangChain prompt template"},
)
stored_prompt_template_lc = prompt_mgr.store_prompt(
    prompt_template=langchain_prompt_template
)
print(f"Asset id: {stored_prompt_template_lc.prompt_id}")
Asset id: 7b9720ef-07f8-43b9-88e1-8f51a6f44b99

Manage Prompt Templates¶

In [11]:
prompt_mgr.list()
Out[11]:
ID NAME CREATED LAST MODIFIED
0 7b9720ef-07f8-43b9-88e1-8f51a6f44b99 LangChain prompt template 2025-11-14T14:34:34Z 2025-11-14T14:34:36Z
1 81fb51ea-4042-4a81-812c-93cf1e41f72c New prompt 2025-11-14T14:34:29Z 2025-11-14T14:34:31Z

To retrieve a Prompt Template asset from the project and return a string that contains the Prompt Template input, we use load_prompt(prompt_id, astype=...). The returned input string is composed of the fields: instruction, input_prefix, output_prefix, examples, and input_text. After substituting prompt variables, you can evaluate a language model on the obtained string.

In [12]:
prompt_text = prompt_mgr.load_prompt(
    prompt_id=stored_prompt_template.prompt_id, astype=PromptTemplateFormats.STRING
)
print(prompt_text)
Answer on the following question

Human What is a loan and how does it work?
Assistant A loan is a debt that is repaid with interest over time.

Human What is {object} and how does it work?
Assistant

To update Prompt Template data, use the prompt_mgr.update_prompt method.

In [13]:
prompt_with_new_instruction = PromptTemplate(
    instruction="Answer on the following question in a concise way."
)
prompt_mgr.update_prompt(
    prompt_id=stored_prompt_template.prompt_id,
    prompt_template=prompt_with_new_instruction,
)
prompt_text = prompt_mgr.load_prompt(
    prompt_id=stored_prompt_template.prompt_id, astype=PromptTemplateFormats.STRING
)
print(prompt_text)
Answer on the following question in a concise way.

Human What is a loan and how does it work?
Assistant A loan is a debt that is repaid with interest over time.

Human What is {object} and how does it work?
Assistant

Furthermore, to get information about the locked state of a Prompt Template, run the following method:

In [14]:
prompt_mgr.get_lock(prompt_id=stored_prompt_template.prompt_id)
Out[14]:
{'locked': True, 'locked_by': 'IBMid-696000GJGB', 'lock_type': 'edit'}

Using the lock or unlock method, you can change the locked state of a Prompt Template asset.

In [15]:
prompt_mgr.unlock(prompt_id=stored_prompt_template_lc.prompt_id)
Out[15]:
{'locked': False}

Once a prompt template is unlocked, it can be deleted. You can also use the list method to check the available prompt templates (passing limit=2 returns only the 2 most recently created templates).

In [16]:
print(
    f"ID of the Prompt Template asset that will be deleted: {stored_prompt_template_lc.prompt_id}"
)
prompt_mgr.delete_prompt(prompt_id=stored_prompt_template_lc.prompt_id)
prompt_mgr.list(limit=2)
ID of the Prompt Template asset that will be deleted: 7b9720ef-07f8-43b9-88e1-8f51a6f44b99
Out[16]:
ID NAME CREATED LAST MODIFIED
0 81fb51ea-4042-4a81-812c-93cf1e41f72c New prompt 2025-11-14T14:34:29Z 2025-11-14T14:34:41Z

Deployment pointing to Prompt Template¶

In the deployment example, we will use the prompt with the following input:

In [17]:
prompt_input_text = prompt_mgr.load_prompt(
    prompt_id=stored_prompt_template.prompt_id, astype=PromptTemplateFormats.STRING
)
print(prompt_input_text)
Answer on the following question in a concise way.

Human What is a loan and how does it work?
Assistant A loan is a debt that is repaid with interest over time.

Human What is {object} and how does it work?
Assistant

Now, we create a deployment by providing the ID of the Prompt Template asset and meta props.

In [18]:
meta_props = {
    client.deployments.ConfigurationMetaNames.NAME: "SAMPLE DEPLOYMENT PROMPT TEMPLATE",
    client.deployments.ConfigurationMetaNames.ONLINE: {},
    client.deployments.ConfigurationMetaNames.BASE_MODEL_ID: "ibm/granite-4-h-small",
}

deployment_details = client.deployments.create(
    artifact_id=stored_prompt_template.prompt_id, meta_props=meta_props
)

######################################################################################

Synchronous deployment creation for id: '81fb51ea-4042-4a81-812c-93cf1e41f72c' started

######################################################################################


initializing
Note: online_url and serving_urls are deprecated and will be removed in a future release. Use inference instead.

ready


-----------------------------------------------------------------------------------------------
Successfully finished deployment creation, deployment_id='4b02d850-f3a0-4de4-8ad5-6613e97f0371'
-----------------------------------------------------------------------------------------------


In [19]:
client.deployments.list()
Out[19]:
ID NAME STATE CREATED ARTIFACT_TYPE SPEC_STATE SPEC_REPLACEMENT
0 4b02d850-f3a0-4de4-8ad5-6613e97f0371 SAMPLE DEPLOYMENT PROMPT TEMPLATE ready 2025-11-14T14:34:53.192Z base_foundation_model not_provided

Substitute prompt variables and generate text¶

In [20]:
deployment_id = client.deployments.get_id(deployment_details)
In [21]:
client.deployments.generate_text(
    deployment_id,
    params={
        "prompt_variables": {"object": "a mortgage"},
        GenParams.TEMPERATURE: 0,
        GenParams.STOP_SEQUENCES: ["\n\n"],
        GenParams.MAX_NEW_TOKENS: 50,
    },
)
Out[21]:
' A mortgage is a loan used to purchase real estate, with the property serving as collateral for the loan.\n\n'

Generate text using ModelInference¶

You can also generate text based on your Prompt Template deployment using the ModelInference class.

In [22]:
from ibm_watsonx_ai.foundation_models import ModelInference
In [23]:
model_inference = ModelInference(
    deployment_id=deployment_id, credentials=credentials, project_id=project_id
)
In [24]:
model_inference.generate_text(
    params={
        "prompt_variables": {"object": "a mortgage"},
        GenParams.DECODING_METHOD: DecodingMethods.GREEDY,
        GenParams.STOP_SEQUENCES: ["\n\n"],
        GenParams.MAX_NEW_TOKENS: 50,
    }
)
Out[24]:
' A mortgage is a loan used to purchase real estate, repaid over a long period with interest.\n\n'

Summary and next steps¶

You successfully completed this notebook!

You learned how to create a valid Prompt Template and store it in a watsonx.ai project. You also learned how to create a deployment pointing to a Prompt Template asset and generate text using the underlying base model.

Check out our Online Documentation for more samples, tutorials, documentation, how-tos, and blog posts.

Authors¶

Mateusz Świtała, Software Engineer at watsonx.ai.

Copyright © 2023-2026 IBM. This notebook and its source code are released under the terms of the MIT License.