Have you spent enough time and energy assessing the quality of your IT service management (ITSM) data? Take the uptake of cloud as an example. Cloud adoption has been steadily increasing over the last decade, resulting in cost savings for businesses. However, moving to the cloud raises new concerns. IT wants to know whether the cloud is secure and whether a cloud app can talk to an on-premises database. Then, from a business perspective, with all these changes, will the company be any closer to using data to achieve a complete and accurate picture of itself and its customers?
The US government knows that moving systems to the cloud brings better security, more cost savings, and faster delivery of services. So, taking a page from the corporate-America playbook, it overhauled its approach to cloud computing this past fall, acknowledging that the federal government should learn from and mimic the successes of the private sector. That overhaul includes improved data governance for hundreds of government business units.
Please read on to find out why you, as an ITSM leader, need to address your organization’s data quality issues, ITSM data included.
Low-quality Data Is Bad for Business
Even without overseeing the data governance of hundreds of business units in the government, ITSM leaders can see that inaccurate and inconsistent data is expensive for their companies.
It’s a problem that has weighed organizations down for a long time. The average corporation’s data grows 40% per year, and 20% of a typical database contains bad data. The cost of data-quality issues is therefore multiplying.
More recently, in March 2019, Forrester reported on the challenges that integration decision makers face. Among a list of 11 challenges to a company’s integration strategy, decision makers ranked data quality as the second biggest concern, behind only data privacy and security.
The Pursuit of Real-Time Data Exchange
A major reason for low-quality data in ITSM is that much of it is obsolete – data that is inaccurate or no longer relevant. Business happens – tickets get resolved, users change positions and departments, new knowledge articles get written, and customers express new preferences. When you rely on nightly batch jobs to transfer data to a database, you are always working with obsolete data.
Dynamic updates, however, help to ensure that personnel are in the know. If analysts suddenly deal with a flood of support tickets because of an outage, real-time transfer of knowledge articles for dealing with that outage can reduce incident resolution time, improve analyst morale, and keep customers happy. Good data leads to good decisions.
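The contrast between nightly batch transfer and real-time propagation can be sketched in a few lines of Python. This is purely illustrative – the class names and the callback interface below are assumptions for the sketch, not any particular ITSM product’s API.

```python
# Illustrative sketch: batch sync vs. real-time (event-driven) sync of
# knowledge articles. All names are hypothetical, not a real ITSM API.
from typing import Callable, Dict, List


class KnowledgeBase:
    """A downstream copy of knowledge articles (e.g., a reporting database)."""

    def __init__(self) -> None:
        self.articles: Dict[str, str] = {}

    def apply(self, article_id: str, body: str) -> None:
        self.articles[article_id] = body


class BatchSync:
    """Batch model: changes pile up and only reach the target on the nightly run."""

    def __init__(self, target: KnowledgeBase) -> None:
        self.target = target
        self.pending: Dict[str, str] = {}

    def record_change(self, article_id: str, body: str) -> None:
        self.pending[article_id] = body  # invisible downstream until the batch runs

    def run_nightly_batch(self) -> None:
        for article_id, body in self.pending.items():
            self.target.apply(article_id, body)
        self.pending.clear()


class RealTimeSync:
    """Real-time model: each change is pushed to subscribers as it happens."""

    def __init__(self) -> None:
        self.subscribers: List[Callable[[str, str], None]] = []

    def subscribe(self, callback: Callable[[str, str], None]) -> None:
        self.subscribers.append(callback)

    def record_change(self, article_id: str, body: str) -> None:
        for callback in self.subscribers:
            callback(article_id, body)  # downstream sees the change immediately


# An analyst searching during an outage only finds the article if it has synced.
batch_kb, live_kb = KnowledgeBase(), KnowledgeBase()
batch = BatchSync(batch_kb)
live = RealTimeSync()
live.subscribe(live_kb.apply)

batch.record_change("KB-101", "Workaround for the VPN outage")
live.record_change("KB-101", "Workaround for the VPN outage")

print("KB-101" in batch_kb.articles)  # False: waits for the nightly job
print("KB-101" in live_kb.articles)   # True: available immediately
```

In the batch model, the workaround article written during an outage sits in the pending queue until the nightly job runs; in the real-time model, the analyst can find it the moment it is published.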
What Digital Transformation Means for Data Quality
The spotlight on data quality has intensified with the rise of digital transformation. Rather than applying patches to seemingly weak areas of the organization, CIOs realize they need to embrace digital transformation to compete. Digital transformation is a total re-engineering of a corporation’s technological and business processes to meet elevated customer expectations.
One key theme in digital transformation, according to the World Economic Forum’s Digital Transformation Initiative, is big data analytics. The emergence of big data means new opportunities for companies to unlock value. When IT and analytics teams apply business intelligence (BI) tools to these data repositories and warehouses, they gain real-time visibility into the statuses and trends of their customers and their company. With such insight, business leaders make informed decisions, and companies become more agile, ready to compete in a dynamic landscape.
But those business decisions are only as good as the data they’re based on. And according to Gartner, data of unacceptable quality continues to influence 97% of business decisions. At the same time, just 37% of companies have implemented a data-governance strategy.
A Chance to Get Ahead
Because of the cost of bad data to an organization, IT leaders devote significant resources toward finding solutions. Data cleansing tools and Master Data Management (MDM) solutions provide some relief. To preserve and enhance data quality and security, companies are also increasingly seeing the value of data integration solutions, a market growing at a CAGR of 14.3%.
Bad data quality is a pervasive ailment, and no corporation escapes it entirely. But companies that improve their data quality give themselves a competitive edge.
Do you have any ITSM data quality horror stories to share? If so, please let me know in the comments.
Alfredo is the Content Marketing Manager at Perspectium. He tells stories about how businesses overcome service-management challenges to create delightful customer experiences (via the creation of case studies, white papers, infographics, videos, blog posts, and other media).