Unreliable source attribution risk for AI
Source attribution is an AI system's ability to identify which training data contributed to a portion, or all, of its output. Because current attribution techniques rely on approximations, the attributions they produce can be incorrect.
Why is unreliable source attribution a concern for foundation models?
Low-quality explanations make it difficult for users, model validators, and auditors to understand and trust the model.
Parent topic: AI risk atlas