Someone asked me recently, “Why should we bother with data quality technology?” They had attended a presentation in which data quality software was criticised as ineffective for long-term data quality improvement and dismissed as little more than a cost centre.
There are many reasons why I personally believe data quality tools make sense, but by far the most compelling for me is the absolute need to gain control of your processes before you embark on long-term improvement.
Process Control First – Then Improvement
When you come to examine some of your core service processes for the first time, there is a good chance they will be partially or even completely out of control. By this I mean there will be a significant degree of variance in the outputs they deliver.
In a service process we may witness this as sporadic delivery durations, variable quality of goods received, intermittent or inaccurate communication, and incorrect billing documentation. Our long-term goal may well be to increase overall performance and create a “Best in Class” offering in the marketplace, but our starting point has to be the reduction of variance. Our customers have to know that if we say our service will take 5 days, it will be delivered in 5 days; if we bill them, the bill must be accurate; and so on.
This is where data quality tools, I believe, are outstanding when first implemented. They bring a great deal of stability to the task of tackling process control issues, helping you spot the defects that increase lead times and reduce the quality of the final product or service. By using data quality technology in a reactive, fire-fighting mode initially, you can start to smooth out some of the high variance you would have witnessed before.
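To make the variance-monitoring idea concrete, here is a minimal sketch of the kind of check a data quality tool automates: flagging delivery durations that fall outside 3-sigma control limits. The sample data, function name, and sigma threshold are illustrative assumptions, not anything from a particular product.

```python
# Minimal sketch: flag observations outside mean +/- 3 standard deviations,
# the classic control-limit test for an out-of-control process.
from statistics import mean, stdev

def out_of_control(durations, sigmas=3.0):
    """Return the durations falling outside mean +/- sigmas * stdev."""
    centre = mean(durations)
    spread = stdev(durations)
    lower, upper = centre - sigmas * spread, centre + sigmas * spread
    return [d for d in durations if d < lower or d > upper]

# Delivery durations in days for a service promised at 5 days.
observed = [5, 5, 6, 5, 4, 5, 5, 21, 5, 6, 5, 5]
print(out_of_control(observed))  # the 21-day outlier is flagged
```

In practice a tool would run checks like this continuously against live data, but even this simple rule illustrates the point: before you can improve average performance, you first need to see, and then eliminate, the outliers.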
Process Improvement for the Long Term
Removing variance from your processes helps you gain support and momentum for the next phase: the wider improvement of the overall process. This may involve not only data but working practices, applications, people skills and so on. By trying to improve everything upfront whilst eliminating variance at the same time, you can quickly become swamped. You need everything working within the boundaries of current performance before you can take that performance to the next level.
There are countless other ways I’ve found data quality technology to add value, but the ability to monitor and smooth out process variation has been the most important for me. What about you? How has your organisation benefited from data quality software? I welcome your views.
the Data Roundtable.