Inaccessible training data risk for AI

Risks associated with output
Explainability
Amplified by generative AI

Description

Without access to the training data, the types of explanations a model can provide are limited and more likely to be incorrect.

Why is inaccessible training data a concern for foundation models?

Without access to the source data, explanations are lower quality, making it difficult for users, model validators, and auditors to understand and trust the model.

Parent topic: AI risk atlas
