Managing data quality

Measure, monitor, and maintain the quality of your data to ensure the data meets your expectations and standards for specific use cases.

Good-quality data can be described as fit for use, defect free, or meeting expectations and requirements. Data quality is measured against the default quality dimensions Accuracy, Completeness, Consistency, Timeliness, Uniqueness, and Validity, and against any custom quality dimensions.

Data quality analysis provides answers to these questions:

  • How good is the overall quality of a data asset?
  • Which data asset has the best quality?
  • How did the quality of a data asset change over time?
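To make the dimensions concrete, the following sketch scores two of them, Completeness and Uniqueness, for a single column. The scoring functions and the sample data are illustrative assumptions, not the product's actual implementation:

```python
# Illustrative scoring of two quality dimensions for one column.
# (Assumed formulas for demonstration, not the product's implementation.)
from collections import Counter

def completeness(values):
    """Fraction of values that are not missing (None or empty string)."""
    if not values:
        return 0.0
    non_missing = [v for v in values if v is not None and v != ""]
    return len(non_missing) / len(values)

def uniqueness(values):
    """Fraction of values that occur exactly once in the column."""
    if not values:
        return 0.0
    counts = Counter(values)
    return sum(1 for v in values if counts[v] == 1) / len(values)

# Hypothetical email column with one missing value and one duplicate.
emails = ["a@x.com", "b@x.com", None, "a@x.com", "c@x.com"]
print(round(completeness(emails), 2))  # 0.8
print(round(uniqueness(emails), 2))   # 0.6
```

Tracking such per-dimension scores for the same asset over time is what makes the "How did the quality change?" question answerable.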

Requirements and restrictions

The following requirements and restrictions apply to data quality management.

Data quality tools

You work with these tools:

Required services

Data quality management requires these services:

  • IBM Knowledge Catalog
  • DataStage

Data formats

The following data formats are supported:

  • Tables from relational and nonrelational data sources
  • Tabular: Avro, CSV, Parquet, ORC

For information about supported connectors, see Supported data sources for curation and data quality.

Data size

Data quality management tasks can be performed on data of any size.

Required permissions

Your roles determine which data quality management tasks you can perform:

  • To view data quality definitions and rules, you must have at least the Viewer role in the project.
  • To create, edit, or delete data quality definitions and rules, you must have the Admin or the Editor role in the project. In addition, you must have the Manage data quality assets user permission.
  • To run data quality rules, you must have the Admin or the Editor role in the project and the Execute data quality rules user permission.
  • To view the data that caused data quality issues, you must have the Drill down to issue details user permission.


You can perform data quality management tasks in projects. Read-only data quality information is available in catalogs.

Data quality analysis and monitoring

Use data quality analysis and monitoring to evaluate data against specific criteria. Applying the same evaluation criteria repeatedly over time lets you spot significant changes in the quality of the data being validated.

After a data quality check is designed, you have these options:

  • Create a data quality definition that defines the logic of the data check independently of any data source. The definition contains logical variables or references that you link, or bind, to actual data (for example, a data source, a table and column, or joined tables) when you create a data quality rule that can be executed.

    After you create a data quality rule with the required bindings based on a selected data quality definition, you can run that rule. The rule produces relevant statistics and, depending on its configuration, can generate an output table.

  • Create an SQL-based data quality rule.

The functionality of a data quality rule can range from a simple test of a single column to evaluating multiple columns within and across data sources.
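The split between a reusable definition and a bound, executable rule can be sketched as follows. The class and variable names are illustrative assumptions; the product configures definitions, bindings, and rules through its own tooling, not through this code:

```python
# Sketch of the definition-vs-rule split described above.
# (Assumed names and structure, not the product's API.)
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class QualityDefinition:
    """Check logic over named logical variables, independent of any data source."""
    name: str
    variables: List[str]
    predicate: Callable[[Dict[str, object]], bool]

@dataclass
class QualityRule:
    """A definition plus bindings of logical variables to actual columns."""
    definition: QualityDefinition
    bindings: Dict[str, str]  # logical variable -> actual column name

    def run(self, rows: List[Dict[str, object]]) -> float:
        """Return the fraction of rows that pass the check."""
        if not rows:
            return 0.0
        passed = 0
        for row in rows:
            values = {var: row[col] for var, col in self.bindings.items()}
            if self.definition.predicate(values):
                passed += 1
        return passed / len(rows)

# Definition: "the value must be non-negative", with no table attached.
non_negative = QualityDefinition(
    name="non_negative",
    variables=["amount"],
    predicate=lambda v: v["amount"] is not None and v["amount"] >= 0,
)

# Rule: bind the logical variable to a hypothetical ORDER_TOTAL column.
rule = QualityRule(non_negative, {"amount": "ORDER_TOTAL"})
rows = [{"ORDER_TOTAL": 10.5}, {"ORDER_TOTAL": -3.0}, {"ORDER_TOTAL": 0.0}]
print(round(rule.run(rows), 2))  # 0.67
```

Because the definition carries no bindings, the same `non_negative` logic could be reused in further rules against other tables and columns.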

Assessing data quality

To determine whether your data is of good quality, check to what extent the data meets your expectations and identify anomalies in the data. Evaluating your data for quality also helps you to understand the structure and content of your data.
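One common way to surface anomalies in a numeric column is the interquartile-range (IQR) method, sketched below. The function, the `k` threshold, and the sample data are illustrative assumptions, not the product's anomaly detection:

```python
# Minimal sketch of flagging numeric anomalies with the IQR method.
# (Illustrative threshold and data, not the product's implementation.)
import statistics

def iqr_outliers(values, k=1.5):
    """Return values lying more than k * IQR outside the quartiles."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

ages = [31, 29, 35, 33, 30, 32, 28, 230]  # 230 is a likely data-entry error
print(iqr_outliers(ages))  # [230]
```

Flagged values like the `230` above are the kind of records a drill-down into data quality issues would let you inspect.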


Parent topic: Preparing data
