Data Quality Assessment Manager (DQAM)
Improve data quality and governance with Data Quality Assessment Manager
About Data Quality Assessment Manager (DQAM)
Data Quality Assessment Manager guides a user through a comprehensive value-level assessment of a given data asset, enabling an organization to scale its data quality activities appropriately. DQAM is structured so that an organization’s data quality analysts can competently apply advanced methods in their field. By supporting collaborative analysis and the sharing of findings with the broader organizational data community, DQAM also promotes a team dynamic among the organization’s data quality analysts.
- Standardize and execute a formal data quality assessment methodology
- Easily manage data quality scores (A, B, C, D, and F) and enable data quality governance programs
- Designed for data stewards and business users to govern data quality
- Track quality reviews and approvals across all stakeholders
- Perform value assessments and categorize data quality issues
- Govern invalid data categories and take action to resolve data quality issues
- Remediate invalid values and create data repair scripts with CATfX
- Run data quality assessments on a schedule
- Manage data quality issues to resolution
- Complements existing DQ and MDM products
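To make the letter-grade scoring above concrete, here is a minimal sketch of how a grade might be derived from the share of valid values in an attribute. DQAM's actual scoring rules are not described here, so the function name and thresholds below are illustrative assumptions only.

```python
def quality_grade(valid_count: int, total_count: int) -> str:
    """Map the share of valid values in an attribute to a letter grade.

    Hypothetical thresholds -- a real governance program would
    calibrate these to its own quality standards.
    """
    if total_count == 0:
        return "F"  # no data to assess
    pct_valid = 100 * valid_count / total_count
    if pct_valid >= 99:
        return "A"
    if pct_valid >= 95:
        return "B"
    if pct_valid >= 90:
        return "C"
    if pct_valid >= 80:
        return "D"
    return "F"

print(quality_grade(995, 1000))  # 99.5% of values valid -> "A"
```

A scheme like this keeps grading transparent: stakeholders can see exactly why an attribute received a given score and what improvement moves it to the next grade.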
Data Quality Assessment Manager brings out-of-the-box features that offer significant value to any data management organization.
Key Features of Data Quality Assessment Manager
DQAM provides the analyst with metadata and the distinct values of each attribute, supporting a comprehensive assessment that governs data quality scores.
The data quality workflow organizes the assessment tasks into a structured flow for the various stakeholders, removing any ambiguity about task ownership and the order of execution.
Building on the comprehensive assessment, DQAM also allows organizations to quickly remediate invalid values based on their invalid data categories and to generate data repair, or “fix it,” scripts that repair the invalid data where it originates.