Continuing my post-Interop thinking and blogging, I’ve subconsciously joined the dots between four discrete Interop events around IT insight:
- A comment made by Harper Reed in his Thursday keynote – that “Big Data is BS” and that what we need is “Big Answers.”
- A presentation session from Chris Pick, of Apptio, called “IT Benchmarking: Why You Don’t Do It But Should!”
- An Expo Hall conversation on performance management with Chris Dunn of ScienceLogic – one of many exhibitors offering capabilities related to IT performance management.
- My earlier Interop blog, which showed little mention of Big Data in either conference sessions or exhibitor blurbs.
In the interest of openness, I know both Chrises from my days as a Forrester analyst, where I would commonly list the three über challenges for IT as dealing with:
- Increased business scrutiny (over what IT costs and the business value the annual investment in IT delivers)
- Increased business and customer/user expectations (over the quality of IT services and the overall service experience)
- Increased business and IT complexity.
So forgetting about the now-quietening hoopla over Big Data and the need for squillions of data scientists, just how good is your average corporate IT organisation at analysing its IT performance data?
The complexity of IT management in the cloud era
If we look at the changing IT landscape, particularly the continued adoption of cloud, today’s IT environments are more interconnected and complex than ever. IT environments are hybridising (I believe this is a word) at an ever-increasing rate – for instance, on-premises virtual machines leveraging cloud storage, and cloud-based virtual machines using on-premises storage.
And while cloud provides IT organisations with greater flexibility and scale, the many benefits of automation, and potentially access to higher spec and lower cost IT capabilities, the new IT environments bring with them risks around:
- Network and IT service disruption
- Data loss
- Disrupted compute
- Cloud sprawl and the associated costs
Of course, IT vendors such as VMware, Cisco, NetApp, and Amazon provide their own tools for managing their products/domains, and the issues within them. But if we have truly moved on from the 1990s technology-domain view of IT, and IT management, then we need insight into how well IT services – rather than IT components – are performing. And, if they aren’t performing as needed, insight into the business impact, the customers affected, and the real underlying issue or issues.
Getting insight (into not-so-big data)
Corporate IT organizations need to be able to rise above the siloed vendor product/domain view to understand the relationships between, and dependencies of, IT components and services, to be able to measure performance from a service delivery point of view, and to quickly get to (and address) the root cause of any issues.
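To make the point concrete, the component-to-service roll-up described above can be sketched in a few lines. This is a minimal illustrative sketch, not any vendor’s actual tooling or API; the service names and component identifiers are hypothetical examples.

```python
# Hypothetical mapping of business services to the IT components they
# depend on – the kind of dependency model that lets a component fault
# be reported as a service-level impact rather than a device alert.
SERVICE_MAP = {
    "online-ordering": ["web-vm-01", "app-vm-02", "db-cluster", "cloud-storage"],
    "payroll":         ["app-vm-03", "db-cluster"],
}

def impacted_services(failed_components, service_map=SERVICE_MAP):
    """Return {service: [failed components it depends on]} for every
    service whose dependency chain includes a failed component."""
    failed = set(failed_components)
    return {
        service: sorted(failed & set(deps))
        for service, deps in service_map.items()
        if failed & set(deps)
    }

# A single shared-storage fault surfaces as an impact on every
# dependent service, pointing straight at the common root cause:
print(impacted_services(["db-cluster"]))
```

The design choice here is the direction of the question: instead of asking “which device raised an alert?”, the dependency map lets you ask “which services, and therefore which customers, are affected, and what do the impacted services have in common?”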
And this is just one part of the complicated cloud service delivery conundrum.
Another is cost. And you can probably hear my soapbox being pulled out from under my desk.
IT business management – three words that still need to sit together better.
Or “technology business management” as Chris Pick, and his colleagues, would call it. Plus, IT business management (or IT financial management) should be, in my opinion, part of the larger IT management discipline. But I’ll move on from the terminology; the important thing is the need for better insight into IT costs and the delivered business value.
As a closet accountant, having qualified as a management accountant in 1994, I continue to be dismayed at the lack of financial understanding and stewardship in many corporate IT organizations. I’ve lost count of the number of times that I’ve spouted something akin to:
“People say that IT should be run as/like a business, but how many businesses would survive without a deep understanding of their market(s), customers, products and services, and product costs and margins?”
My point being that too many corporate IT organizations struggle, or don’t think they need, to understand what it costs to deliver the products and services they provide. ITIL, the IT service management (ITSM) best practice framework formerly known as the IT Infrastructure Library, calls this service costing.
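At its simplest, service costing is arithmetic: gather the cost pools attributable to a service, total them, and derive a unit cost. The sketch below is illustrative only – the cost categories and figures are hypothetical, not drawn from the article or from ITIL itself.

```python
# Illustrative ITIL-style service costing: roll up the annual cost
# pools attributed to one service, then derive a cost per user.

def service_unit_cost(cost_pools, users):
    """Return (total annual cost, cost per user) for a service."""
    total = sum(cost_pools.values())
    return total, total / users

# Hypothetical annual cost pools for a corporate email service:
email_costs = {
    "hardware": 40_000,
    "software_licences": 25_000,
    "support_staff": 60_000,
    "cloud_hosting": 15_000,
}

total, per_user = service_unit_cost(email_costs, users=2_000)
print(f"Total: {total}, per user: {per_user:.2f}")  # Total: 140000, per user: 70.00
```

Even a crude per-user figure like this is enough to start the conversations the article calls for – comparing services against each other, against benchmarks, and against the business value they deliver.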
But Chris gave me hope during his presentation:
The not-for-profit Technology Business Management Council (of which he is president) now has over 1,250 members. And, borrowing one of Chris’ Interop slides, the benefits are definitely out there for organizations that finally wake up to the fact that they can’t continue to manage IT service delivery with only a limited, and most likely manual, ability to understand and manage IT costs.
So, for me, the two Chrises were talking to the same point from their own perspectives: that many corporate IT organizations need better insight into their IT operations – not at a technology level but at a business level, particularly with respect to service quality and costs (and ultimately business value).
So, in all the hoo-ha about Big Data, have we been doing enough with our not-so-big data? The long-forgotten accountant in me thinks not – better insight into IT operations and costs leads to better decisions and better business.
This blog was originally written for Computer World UK in 2014. You can check out the original version here.
Principal Analyst and Content Director at the ITSM-focused industry analyst firm ITSM.tools. Also an independent IT and IT service management marketing content creator, and a frequent blogger, writer, and presenter on the challenges and opportunities for IT service management professionals.
Previously held positions in IT research and analysis (at IT industry analyst firms Ovum and Forrester and the UK Post Office), IT service management consultancy, enterprise IT service desk and IT service management, IT asset management, innovation and creativity facilitation, project management, finance consultancy, internal audit, and product marketing for a SaaS IT service management technology vendor.