Python library

You can programmatically inference and tune foundation models in IBM watsonx.ai by using the Python library.

See Foundation models Python library.

You can also work with watsonx.ai foundation models from third-party tools, including LangChain and LlamaIndex, which are described in the sections that follow.

Learn from available sample notebooks

Sample notebooks are available that you can use as a guide when you create your own notebooks for common tasks, such as inferencing or tuning a foundation model.

To find available notebooks, search the Resource hub. You can add notebooks that you open from the Resource hub to your project, and then run them.

You can also access notebooks from the Python sample notebooks GitHub repository.

Using the Python library from your IDE

The ibm-watsonx-ai Python library is available on PyPI at https://pypi.org/project/ibm-watsonx-ai/.

You can install the ibm-watsonx-ai Python library in your integrated development environment by using the following command:

pip install ibm-watsonx-ai

If you already have the library installed, include the -U parameter to pick up any updates and work with the latest version of the library.

pip install -U ibm-watsonx-ai
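
After you install the library, you can inference a foundation model directly from your IDE. The following snippet is a minimal sketch: the endpoint URL, API key, model ID, project ID, and generation parameters are placeholder or example values that you replace with your own.

# Minimal sketch: inference a foundation model with the ibm-watsonx-ai library.
# The URL, API key, model ID, project ID, and parameters are placeholders.
from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference

credentials = Credentials(
    url="https://us-south.ml.cloud.ibm.com",  # endpoint for your region or cluster
    api_key="YOUR_IBM_CLOUD_API_KEY",
)

model = ModelInference(
    model_id="ibm/granite-13b-instruct-v2",   # example model ID
    credentials=credentials,
    project_id="YOUR_PROJECT_ID",
    params={"decoding_method": "greedy", "max_new_tokens": 200},
)

print(model.generate_text(prompt="Summarize retrieval-augmented generation in one sentence."))

In this sketch, generate_text returns only the generated text; the library also provides a generate method that returns the full response, including metadata.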

Working with LangChain from a notebook

LangChain is a framework that developers can use to create applications that incorporate large language models. LangChain can be useful when you want to link two or more functions together. For example, you can use LangChain as part of a retrieval-augmented generation (RAG) task.

For more information, see LLMs > IBM watsonx.ai.

To learn more, use one of the sample RAG notebooks that leverage LangChain. See RAG examples.

The Use watsonx.ai and LangChain Agents to perform sequence of actions notebook shows how to use agents to perform a sequence of actions based on foundation model responses.
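
As a sketch of how the integration fits together, the following example uses the WatsonxLLM class from the langchain-ibm package to call a watsonx.ai model inside a simple LangChain chain. The model ID, endpoint URL, and credential values are placeholders, and the exact interface can change; check the LangChain documentation for the current details.

# Sketch: call a watsonx.ai foundation model through LangChain (langchain-ibm package).
# The model ID, URL, API key, and project ID are placeholders.
from langchain_ibm import WatsonxLLM
from langchain_core.prompts import PromptTemplate

llm = WatsonxLLM(
    model_id="ibm/granite-13b-instruct-v2",
    url="https://us-south.ml.cloud.ibm.com",
    apikey="YOUR_API_KEY",
    project_id="YOUR_PROJECT_ID",
    params={"max_new_tokens": 200},
)

# Link a prompt template and the model into a small chain.
prompt = PromptTemplate.from_template("Answer briefly: {question}")
chain = prompt | llm
print(chain.invoke({"question": "What is retrieval-augmented generation?"}))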

Working with LlamaIndex functions from a notebook

LlamaIndex is a framework for building large language model applications. You can use functions that are available from LlamaIndex, such as text-to-SQL or Pandas DataFrame capabilities, in applications that you build with watsonx.ai foundation models.

For more information, see LLMs > IBM watsonx.ai.

You can work with LlamaIndex functions from a notebook in watsonx.ai.

For example, use the Use watsonx, and LlamaIndex for Text-to-SQL task notebook to convert natural language queries into SQL queries.
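
The following sketch shows one way to use a watsonx.ai model as the LLM in a LlamaIndex application, through the WatsonxLLM class from the llama-index-llms-ibm package. The model ID and credential values are placeholders; see the LlamaIndex documentation for the current interface.

# Sketch: use a watsonx.ai foundation model as the LLM in LlamaIndex
# (llama-index-llms-ibm package). The model ID, URL, API key, and project ID are placeholders.
from llama_index.llms.ibm import WatsonxLLM

llm = WatsonxLLM(
    model_id="ibm/granite-13b-instruct-v2",
    url="https://us-south.ml.cloud.ibm.com",
    apikey="YOUR_API_KEY",
    project_id="YOUR_PROJECT_ID",
)

# Simple text completion; this same llm object can also back LlamaIndex
# query engines, such as the text-to-SQL engine used in the sample notebook.
print(llm.complete("List three uses of a text-to-SQL pipeline."))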

Prerequisites

To get started with the Python library, you first need credentials and a project ID or deployment ID. For more information, see the related credentials and project topics in the watsonx.ai documentation.
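
The following snippet is a minimal sketch of where those values go; all values are placeholders, and the endpoint URL depends on your region or cluster.

# Sketch: credentials plus a scoping ID for the ibm-watsonx-ai library (placeholder values).
from ibm_watsonx_ai import Credentials

credentials = Credentials(
    url="https://us-south.ml.cloud.ibm.com",  # endpoint for your region or cluster
    api_key="YOUR_API_KEY",
)

# Pass project_id to work with assets in a project, or deployment_id (with a space ID)
# to inference a deployed model, when you construct classes such as ModelInference.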

Learn more

Parent topic: Coding generative AI solutions
