Delivering improvement in data quality for one of the largest electronics companies in the world
The Big Picture
One of the world’s largest electronics companies was challenged by poor data quality and ineffective data compliance systems. This led to error-prone business processes, causing issues such as incorrect customer invoices and late payments driven by invoice errors. The existing process for data quality improvement was slow, manual, error-prone, and difficult to scale.
Fractal developed a solution that drove data quality improvement across the enterprise. The platform performs data preparation on large volumes of data and data rule calculations, and is easy to manage through its front-end dashboards, which capture data quality insights across multiple business dimensions.
The client partnered with us to build the foundations of insightful data management and reporting. Fractal created a solution that could quickly turn information into insights and present them for important business decisions. The backbone of the entire solution was the data rules configuration used to identify defective records in the data.
The solution is built around a back-end engine for data extraction, data preparation, and data rules configuration across various data sets. Rules are created from definitions provided by the business, the output structure is designed to match dashboard requirements, and results are stored so they can feed multiple sets of dashboards.
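The case study does not publish the engine's internals, so the following is a minimal sketch of how a configurable rule-based defect check might look; all names and rules here are hypothetical illustrations of the approach, not the client's actual configuration.

```python
# Illustrative sketch of a configurable data-rules engine (names are hypothetical).
# Each rule pairs a business-provided definition with a predicate; the engine
# applies every rule to every record and returns the defective combinations.

from dataclasses import dataclass
from typing import Callable

@dataclass
class DataRule:
    name: str
    description: str                 # business-provided definition
    check: Callable[[dict], bool]    # returns True when the record passes

def find_defects(records: list[dict], rules: list[DataRule]) -> list[dict]:
    """Return one defect entry per (record, failed rule) pair."""
    defects = []
    for i, record in enumerate(records):
        for rule in rules:
            if not rule.check(record):
                defects.append({"record_index": i, "rule": rule.name})
    return defects

# Example rules resembling the invoice issues described above
rules = [
    DataRule("customer_id_present", "Every invoice must name a customer",
             lambda r: bool(r.get("customer_id"))),
    DataRule("positive_amount", "Invoice amount must be positive",
             lambda r: isinstance(r.get("amount"), (int, float)) and r["amount"] > 0),
]

records = [
    {"customer_id": "C001", "amount": 120.0},   # clean record
    {"customer_id": "",     "amount": -5.0},    # fails both rules
]

defects = find_defects(records, rules)
print(defects)
```

Keeping the rules as data rather than hard-coded logic is what lets business users add or adjust definitions without touching the engine itself.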
The focus was an optimized process: minimal processing time, parallel processing, and data aggregations that reduce storage space and give users the best dashboard performance.
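One way to read the aggregation step above: rather than shipping record-level rule results to the dashboards, roll them up by reporting period and business dimension so storage stays small and dashboard queries stay fast. The sketch below is an assumption about that roll-up, with made-up field names:

```python
# Hypothetical pre-aggregation step: count defects per (week, domain) so the
# dashboards read a compact summary instead of raw record-level results.

from collections import Counter

def aggregate_results(rule_results: list[dict]) -> dict:
    """Roll record-level pass/fail rows up into defect counts per key."""
    summary = Counter()
    for row in rule_results:
        if not row["passed"]:
            summary[(row["week"], row["domain"])] += 1
    return dict(summary)

rule_results = [
    {"week": "2019-W01", "domain": "customers", "passed": False},
    {"week": "2019-W01", "domain": "customers", "passed": True},
    {"week": "2019-W01", "domain": "vendors",   "passed": False},
    {"week": "2019-W02", "domain": "customers", "passed": False},
]

print(aggregate_results(rule_results))
# {('2019-W01', 'customers'): 1, ('2019-W01', 'vendors'): 1, ('2019-W02', 'customers'): 1}
```

In a Hadoop setting the same shape of roll-up would typically run as a distributed group-by, which is where the parallel processing mentioned above comes in.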
The platform’s back end was built on QlikView and Hadoop, while the front-end dashboards ran on Qlik Sense with an on-demand app generation feature, enabling users to see detailed reports in just a few clicks. The big data platform was set up on Hadoop on Azure/AWS, making it scalable for future requirements.
The solution successfully transformed the data cleansing operation, reducing data defects faster, improving data quality, and enabling tracking of KPIs such as asset score, coverage, and conformance.
A rule-based approach to address data quality issues became easy with a single platform for integrating cross-domain data from multiple sources. The client was now able to:
- Get week-on-week comparisons of KPIs,
- Track the impact of data cleansing activities on existing data rules,
- Identify the need for new data rules,
- Drive improvement in data quality by focusing on top assets (customers, vendors, products), critical data fields, asset score, coverage, and conformance.
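The KPI names above (coverage, conformance) come from the case study, but their exact formulas are not given; the sketch below uses illustrative definitions as an assumption: coverage as the share of critical-field slots that are populated, and conformance as the share of records passing every rule.

```python
# Illustrative KPI calculations; the formulas are assumptions, not the
# client's actual definitions.

def coverage(records: list[dict], critical_fields: list[str]) -> float:
    """Share of critical-field slots that are actually populated."""
    total = len(records) * len(critical_fields)
    filled = sum(1 for r in records for f in critical_fields if r.get(f))
    return filled / total if total else 1.0

def conformance(records: list[dict], rules: list) -> float:
    """Share of records that pass every data rule."""
    passing = sum(1 for r in records if all(rule(r) for rule in rules))
    return passing / len(records) if records else 1.0

records = [
    {"customer_id": "C001", "amount": 100},
    {"customer_id": "",     "amount": -1},
]
rules = [
    lambda r: bool(r.get("customer_id")),
    lambda r: r.get("amount", 0) > 0,
]

print(coverage(records, ["customer_id", "amount"]))  # 0.75 (3 of 4 slots filled)
print(conformance(records, rules))                   # 0.5 (1 of 2 records passes)
```

Tracked week on week, falls in these ratios point to where new rules or cleansing effort are needed.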
The solution has been established as a central tool within the organization for driving data governance strategy, delivering a 1% improvement in the data quality index and a reduction of 4.3 million data defects in 2019.
Insights delivered from this solution have helped the client with:
- Active correction of data at the source by the data cleansing team,
- Executive monitoring of improvements in critical KPIs across multiple domains to inform strategic decisions,
- Flexibility to leverage big data platforms and accommodate a broader scope.