[tabby title=»Performed»]

Data quality assessments are performed and results are documented.
Example Work Products

  • Data quality rules
  • Data quality assessment results
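
As an illustration, data quality rules can be expressed as named, executable checks applied to each record, producing documentable assessment results. The sketch below is a hypothetical example; the rule names, record fields, and value ranges are assumptions for illustration, not prescribed by this practice.

```python
# Illustrative sketch: data quality rules expressed as named, executable
# checks. Rule names and record fields are hypothetical examples.

def rule_not_null(record, field):
    """Completeness rule: the field must be present and non-empty."""
    return record.get(field) not in (None, "")

def rule_in_range(record, field, lo, hi):
    """Validity rule: a numeric field must fall within an allowed range."""
    value = record.get(field)
    return value is not None and lo <= value <= hi

def assess(records, rules):
    """Apply each (name, check) rule to every record and tally failures."""
    results = {name: 0 for name, _ in rules}
    for record in records:
        for name, check in rules:
            if not check(record):
                results[name] += 1
    return results

records = [
    {"customer_id": "C1", "age": 34},
    {"customer_id": "", "age": 150},
]
rules = [
    ("customer_id_not_null", lambda r: rule_not_null(r, "customer_id")),
    ("age_in_range", lambda r: rule_in_range(r, "age", 0, 120)),
]
failures = assess(records, rules)
```

The tallied failure counts per rule form a simple, documentable assessment result.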

[tabby title=»Managed»]

2.1 Data quality assessment objectives, targets, and thresholds are established, used, and maintained according to standard techniques and processes.

2.2 Data governance determines the key set of attributes by subject area for
data quality assessments.
Defining the set of key attributes by subject area is essential to managing the quality of shared data assets for the organization. Data governance approves the key attributes and ensures that the expected value will exceed the costs associated with managing and assessing those attributes; it also establishes stewardship of terms and definitions, including how data quality is defined and measured, to ensure consistency across the organization. This function also includes the selection of data quality dimensions against which the data will be assessed.
Refer to Data Quality Strategy for additional information related to data quality dimensions.

2.3 Data quality assessments are conducted periodically according to an approved frequency per the data quality assessment policy.
Within data quality assessments, the data quality is evaluated from a business unit perspective for consistency with defined data quality dimensions and criteria.
Refer to Data Profiling for additional information related to practices associated with determination of data quality.
Refer to Data Quality Strategy for additional information related to data quality dimensions and prioritization.

2.4 Data quality assessment results include recommendations for remediation with supporting rationale.
Business stakeholders prioritize data quality improvements and evaluate, test, and approve data quality metrics.
Supporting rationale typically includes root cause analysis and impact assessments, which help to determine corrective action options and support prioritization decisions for data quality improvements.

2.5 Impact analysis includes estimates of the costs of fixes, the level of effort, characterization of the business impact, and tangible and intangible benefits.
The purpose of impact analysis is to support decisions about remediating data quality defects. Because remediation resources are finite, impact analysis is essential for prioritizing which defects to address.
It is useful to assign a severity characterization to business impacts, estimating the effect of the defect on a business process (for example, high, medium, low) with a brief rationale. Technical impacts, which address the difficulty, time, or effort of remediating the defect, can also be estimated and characterized as high, medium, and low.
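
A severity characterization like the one described can be sketched as a simple scoring rule. The cut-off values and impact attributes below are hypothetical assumptions for illustration, not values defined by this practice:

```python
# Illustrative sketch: characterizing business impact severity as
# high/medium/low. Thresholds and impact attributes are hypothetical.

def characterize_severity(affected_processes, defect_rate):
    """Assign a coarse severity band from the number of affected business
    processes and the observed defect rate (fraction of failing records)."""
    if affected_processes >= 3 or defect_rate >= 0.10:
        return "high"
    if affected_processes == 2 or defect_rate >= 0.02:
        return "medium"
    return "low"

impact = {
    "defect": "missing customer_id",
    "severity": characterize_severity(affected_processes=4, defect_rate=0.01),
    "rationale": "Blocks billing and CRM matching across several processes.",
}
```

Recording the rationale alongside the severity band keeps the characterization reviewable by business stakeholders.
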
High-level information in data quality assessment reports is traceable to the individual component records to support analysis.
Drill-down capability to actual values is recommended to facilitate understanding of the data and isolate the source(s) of defects.
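
Such traceability can be sketched by retaining the failing records alongside each aggregate figure, so that a summary count can be drilled down to actual values. The field names below are hypothetical:

```python
# Illustrative sketch: summary results that remain traceable to the
# individual records behind them. Field names are hypothetical.

def assess_with_drilldown(records, check, rule_name):
    """Return both the aggregate failure count and the failing records
    themselves, so report figures can be drilled down to source values."""
    failing = [r for r in records if not check(r)]
    summary = {"rule": rule_name, "checked": len(records), "failed": len(failing)}
    return summary, failing

records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},
    {"id": 3, "email": None},
]
summary, failing = assess_with_drilldown(
    records, lambda r: bool(r.get("email")), "email_present"
)
```
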
Example Work Products

  • Documentation of objectives, targets, and thresholds
  • Documented data quality dimensions and attributes
  • Documented metrics for data quality assessment
  • Documented analysis of business and technical impacts
  • Level of effort estimates for data quality improvements
  • Evidence that business stakeholders review data quality assessments
  • Recommendations for remediation

[tabby title=»Defined»]

3.1 Periodic data quality assessments are conducted in accordance with data quality policies on an approved schedule, or according to specified event triggers.
Data quality assessment policies should define how methodologies, processes, and data quality rules are to be standardized, communicated, and consistently applied across the organization. The criteria that drive assessments should also be specified.
3.2 The methods for assessing business impacts, including costs and risks, are defined, approved, and consistently applied across the organization.
When assessing impacts, data quality professionals should ensure that techniques are applied to determine root causes of any data quality issues. This may require analysis outside of the data set scope.
3.3 Improvement plans resulting from data quality assessments are integrated at the organization level.
Improvement initiatives resulting from assessment activities, other than data cleansing, should address the root cause of the quality problem. This is likely to impact multiple stakeholders and consuming data stores. It is important to ensure that improvement initiatives are evaluated and planned at an organizational level. In addition, dependencies of a subject area data set upon another data set should be identified and orchestrated to minimize rework. Senior management prioritization of these activities should be driven by the impact assessments provided with the quality assessment reports.
3.4 Data quality is assessed using established thresholds and targets for each selected quality dimension.

When performing assessments, it is important to provide a clear method of communicating the importance of the findings and to show relative impacts to the organization. Results should be organized by the standard approved data quality dimensions. Scores against defined thresholds and targets provide clear indicators of where quality is satisfactory or deficient.
Refer to Data Quality Strategy for information on best practices for developing data quality dimensions.
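
Scoring against thresholds and targets per dimension can be sketched as follows; the dimension names and numeric levels are hypothetical assumptions, not values defined by this practice:

```python
# Illustrative sketch: classifying assessed dimension scores against a
# threshold (minimum acceptable) and a target (desired). Dimension names
# and numeric levels are hypothetical.

def rate_dimension(score, threshold, target):
    """Classify a dimension score: below threshold is deficient, at or
    above target is satisfactory, in between needs attention."""
    if score < threshold:
        return "deficient"
    if score >= target:
        return "satisfactory"
    return "needs attention"

dimensions = {
    # dimension: (assessed score, threshold, target), all on a 0-100 scale
    "completeness": (97.5, 95.0, 99.0),
    "validity": (91.0, 95.0, 99.0),
    "timeliness": (99.4, 95.0, 99.0),
}
report = {
    name: rate_dimension(score, threshold, target)
    for name, (score, threshold, target) in dimensions.items()
}
```

Organizing the output by dimension gives a direct indicator of where quality is satisfactory or deficient.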

3.5 Data quality measurement reporting standards are integrated into the systems development lifecycle and compliance processes.
Example Work Products

  • Documented scores, targets, and thresholds for each standard data quality dimension
  • Published and accessible organization-level data quality rules for approved attributes
  • An organization-level data quality assessment policy
  • Standard organization-level data quality assessment processes

[tabby title=»Measured»]

4.1 Data quality measurement reports are systematically generated based on criticality of attributes and data volatility.
Measurement reports are developed according to standard mechanisms such as dashboards and scorecards.
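
A minimal scorecard can be generated directly from assessment results as plain text. The subject area, attribute names, and scores below are hypothetical examples:

```python
# Illustrative sketch: rendering assessment results as a plain-text
# scorecard. The attribute names and scores are hypothetical examples.

def render_scorecard(title, results):
    """Format per-attribute quality scores as an aligned text scorecard."""
    width = max(len(name) for name in results)
    lines = [title, "-" * len(title)]
    for name, score in sorted(results.items()):
        lines.append(f"{name.ljust(width)}  {score:6.2f}%")
    return "\n".join(lines)

scorecard = render_scorecard(
    "Customer subject area - weekly quality scorecard",
    {"completeness": 97.5, "validity": 91.0, "uniqueness": 99.9},
)
```
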
4.2 Data quality operational metadata is standardized, captured, and analyzed using statistical and other quantitative techniques to guide improvements.
Refer to Metadata Management for additional information related to metadata management and its relationship to data quality.
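
One common quantitative technique is statistical process control over quality scores captured per assessment run: measurements falling outside mean ± 3 standard deviations signal that the process has shifted. The measurement series below is a hypothetical example:

```python
# Illustrative sketch: applying a simple statistical control check
# (mean +/- 3 standard deviations) to a series of data quality scores
# captured per assessment run. The series is a hypothetical example.
import statistics

def out_of_control(scores):
    """Return the indices of scores falling outside mean +/- 3 sigma,
    a basic signal that a quality process has shifted."""
    mean = statistics.mean(scores)
    sigma = statistics.pstdev(scores)
    lower, upper = mean - 3 * sigma, mean + 3 * sigma
    return [i for i, s in enumerate(scores) if s < lower or s > upper]

weekly_scores = [98.0, 98.1, 97.9, 98.0, 98.2, 97.8, 98.0, 98.1, 97.9, 98.0, 90.0]
flagged = out_of_control(weekly_scores)
```

Flagged runs can then be drilled down to the assessment that produced them to guide improvement work.
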
Example Work Products

  • Audit and control reports
  • Data quality assessment progress reports for improvements
  • Data quality confidence surveys (e.g., determining the level of user trust for a given data set)

[tabby title=»Optimized»]
5.1 The organization can quantitatively assess the benefits of proposed data changes and refine management priorities in line with data quality governance practices.
5.2 Data quality assessment and reporting processes are continuously reviewed and improved.
Example Work Products

  • Assessment analysis results
  • Process review documentation
  • Process improvement proposals and approvals

[tabbyending]