Foundation model lifecycle
Last updated: Dec 05, 2024

To help you discover and use the latest and best foundation models from IBM and other open source or third-party providers, the list of foundation models that are available for prompting in watsonx.ai is refreshed regularly.

Foundation models that are built by IBM are continuously updated and improved. As new versions of IBM foundation models are introduced, older versions remain available for you to use for at least 90 days after an updated model is introduced.

Similarly, as newer and more effective models from other providers become available, less useful models are removed from watsonx.ai to make room for them. You are given at least 30 days' notice before foundation models from other providers are removed from watsonx.ai.

Modifications to IBM foundation models

IBM foundation models are periodically modified by IBM to improve the foundation model performance or security. A modification is a model refresh that might include new capabilities or fixes, but does not meet IBM's criteria to warrant a version update.

Foundation model modifications do not disrupt the watsonx.ai service. You can check the current full version number for an IBM foundation model at any time from the model card. The version number consists of three digits that identify the version, modification, and fix numbers that are associated with the IBM foundation model. For more information about versioning, see IBM Software product versioning explained.
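As a sketch of how the three-part number breaks down (the dotted `version.modification.fix` string format is an assumption for illustration; the model card is the authoritative source):

```python
def parse_model_version(version: str) -> dict:
    """Split a three-part IBM version string into its named components.

    Per IBM software product versioning, the digits identify the
    version, modification, and fix numbers. The dotted string format
    here is an assumption for illustration only.
    """
    version_num, modification, fix = (int(part) for part in version.split("."))
    return {"version": version_num, "modification": modification, "fix": fix}


# A modification refresh changes the middle number; it may add
# capabilities or fixes but does not warrant a version update.
print(parse_model_version("2.1.0"))  # → {'version': 2, 'modification': 1, 'fix': 0}
```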

Any applications that inference an IBM foundation model that is modified will pick up the modifications, including any changes in performance or in the output that is generated by the model.

Foundation model deprecation

During the deprecation period, you can continue to inference the deprecated foundation model. However, a message is returned with the foundation model output to notify you about the upcoming model removal.

A deprecated foundation model can also be constricted. When a deprecated model is in the constricted state, it means the model can be inferenced, but cannot be tuned, trained, or deployed.
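The lifecycle states above can be used to filter models programmatically. The record shape and field names below are assumptions for illustration, not the watsonx.ai API schema:

```python
# Hypothetical model records reflecting the lifecycle states described
# above. The "lifecycle" field name and values are assumptions, not the
# actual watsonx.ai API schema.
models = [
    {"model_id": "ibm/granite-3-8b-instruct", "lifecycle": "available"},
    {"model_id": "ibm/granite-13b-chat-v2", "lifecycle": "deprecated"},
    {"model_id": "ibm/granite-7b-lab", "lifecycle": "constricted"},
]

# Deprecated and constricted models can still be inferenced, but a
# constricted model cannot be tuned, trained, or deployed. Exclude
# non-available models when picking a model for tuning.
tunable = [m["model_id"] for m in models if m["lifecycle"] == "available"]
print(tunable)  # → ['ibm/granite-3-8b-instruct']
```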

When a foundation model is deprecated, the following steps are taken to inform you about the deprecation:

  • The foundation model is highlighted in the product user interface with a warning icon. A tooltip indicates that the deprecated model is scheduled for withdrawal.
  • The deprecation is announced in the What’s new topic of the product documentation. The release note clearly states the deprecation date and withdrawal date for the foundation model.
  • The Deprecated foundation models table is updated to show the foundation model that is being deprecated, the dates of deprecation and withdrawal, and a suitable alternative foundation model for you to consider as a replacement.

Deprecated models

The following table lists the foundation models that are deprecated.

Table 1: Deprecated foundation models
| Foundation model name | API model ID | Deprecation date | Withdrawal date | Alternative foundation model |
|---|---|---|---|---|
| llama-3-8b-instruct | meta-llama/llama-3-8b-instruct | 2 December 2024 | 3 February 2025 | llama-3-1-8b-instruct, llama-3-2-11b-vision-instruct |
| llama-3-70b-instruct | meta-llama/llama-3-70b-instruct | 2 December 2024 | 3 February 2025 (Dallas, Frankfurt, London, and Tokyo data centers); 31 March 2025 (Sydney data center) | llama-3-1-70b-instruct, llama-3-2-90b-vision-instruct |
| granite-13b-chat-v2 | ibm/granite-13b-chat-v2 | 4 November 2024 | 3 February 2025 | granite-3-8b-instruct |
| granite-7b-lab | ibm/granite-7b-lab | 7 October 2024 | 7 January 2025 | granite-3-8b-instruct |
| llama-2-13b-chat | meta-llama/llama-2-13b-chat | 26 August 2024 | | llama-3-1-8b-instruct |

Planned deprecations

The following table shows the models that are planned for upcoming deprecation. The exact deprecation dates, withdrawal dates, and other details might change.

Table 2: Upcoming foundation model deprecations
| Foundation model name | Planned deprecation date | Planned withdrawal date | Alternative foundation models |
|---|---|---|---|

There are no models that are planned for deprecation currently.

Withdrawn models

The following table lists the foundation models that were previously available from watsonx.ai, but are now withdrawn.

Table 3: Withdrawn foundation models
| Foundation model name | API model ID | Deprecation date | Withdrawal date | Alternative foundation model |
|---|---|---|---|---|
| granite-13b-chat-v1 | ibm/granite-13b-chat-v1 | 11 January 2024 | 11 April 2024 | granite-13b-chat-v2 |
| granite-13b-instruct-v1 | ibm/granite-13b-instruct-v1 | 11 January 2024 | 11 April 2024 | granite-13b-instruct-v2 |
| gpt-neox-20b | eleutherai/gpt-neox-20b | 15 February 2024 | 21 March 2024 | mixtral-8x7b-instruct-v01-q |
| mpt-7b-instruct2 | ibm/mpt-7b-instruct2 | 15 February 2024 | 21 March 2024 | mixtral-8x7b-instruct-v01-q |
| starcoder-15.5b | bigcode/starcoder | 15 February 2024 | 25 April 2024 | codellama-34b-instruct |
| merlinite-7b | ibm-mistralai/merlinite-7b | 22 July 2024 | 22 August 2024 | mixtral-8x7b-instruct-v01 |
| mixtral-8x7b-instruct-v01-q | ibm-mistralai/mixtral-8x7b-instruct-v01-q | 19 April 2024 | 30 August 2024 | mixtral-8x7b-instruct-v01 |
| llama-2-70b-chat | meta-llama/llama-2-70b-chat | 26 August 2024 | 25 September 2024 | llama-3-1-70b-instruct |
| llama3-llava-next-8b-hf | meta-llama/llama3-llava-next-8b-hf | 7 October 2024 | 7 November 2024 | llama-3-2-11b-vision-instruct |
| llama2-13b-dpo-v7 | mnci/llama2-13b-dpo-v7 | 4 November 2024 | 4 December 2024 | llama-3-1-8b-instruct |
| mt0-xxl-13b | bigscience/mt0-xxl | 4 November 2024 | 4 December 2024 | llama-3-1-8b-instruct, llama-3-2-11b-vision-instruct |

What to do next

You must choose an alternative supported foundation model if any of the following saved resources submit input to a withdrawn foundation model:

  • Prompt template asset
  • Prompt session asset
  • Notebook asset
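
Before you update a saved asset, you can map a withdrawn model ID to its suggested alternative. The sketch below hard-codes a few rows from Table 3; the mapping is illustrative, and where the table lists multiple alternatives you should pick the one that best fits your workload:

```python
# Replacement suggestions taken from Table 3 (Withdrawn foundation models).
# Only a few rows are shown here; extend the mapping as needed.
WITHDRAWN_ALTERNATIVES = {
    "ibm/granite-13b-chat-v1": "granite-13b-chat-v2",
    "eleutherai/gpt-neox-20b": "mixtral-8x7b-instruct-v01-q",
    "meta-llama/llama-2-70b-chat": "llama-3-1-70b-instruct",
}


def suggest_replacement(model_id: str) -> str:
    """Return the suggested alternative for a withdrawn model ID,
    or the model ID itself if it is not in the withdrawn mapping."""
    return WITHDRAWN_ALTERNATIVES.get(model_id, model_id)


print(suggest_replacement("eleutherai/gpt-neox-20b"))  # → mixtral-8x7b-instruct-v01-q
```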

For more information about working with saved prompt assets, see Saving your work.

For more information about how to change the foundation model that is inferenced from a notebook asset, see Inferencing a foundation model with a notebook.

Parent topic: Supported foundation models
