The Flawed Nature of ITSM Tool Peer Review Sites


Let’s talk about ITSM tool peer review sites. Now don’t get me wrong, I’m a big fan of peer reviews – for instance I won’t entertain the purchase of anything on Amazon if it scores less than a four-star average in customer reviews. I’m also an advocate of enterprises seeking peer opinions when investing in new IT service management (ITSM) technology – why wouldn’t people want to know how existing customers are loving or loathing their ITSM tool?

However – and it’s the big “but” moment – I’m not so sure about the growth in the creation, promotion, and use of peer-review matrices to “help” companies with their ITSM tool selection decisions. This blog explains why, looking in particular at how accurately these matrices can, and do, compare tools.

The source of this “soapbox moment”

I’ve watched silently as more and more of these ITSM tool peer-review matrices appear in my Twitter timeline – it’s usually an ITSM tool vendor promoting their position in a matrix. One that they just happen to do well in: “Do you know that we are a leader in the such-and-such matrix?”

You can’t blame them, as ITSM-tool marketing is sometimes a case of “keeping up with the Joneses.” Plus, the tool vendors who do well in these peer-review comparisons often don’t get included in Tier 1 analyst reports (yet).

But I had to break my silence, and risk alienating some of the companies that ultimately pay my mortgage, after I saw a tweet maligning an ITSM tool vendor based solely on its position in one of the many peer-review matrices. That tweet committed me to taking a closer look at the reality of these ITSM tool comparisons.

The general issues with feedback requests

Think about the times you fill out feedback requests – do you fill out every one? I’d imagine not; your time is too important. You probably do what many of us do:

  • If it was a great experience, then you probably want to recognize someone who helped you – good, or great, feedback is given.
  • If it was a bad experience, then “heck, yeah!” you’re going to vent your spleen stating that you’ll never use that company or product/service again (guilty as charged).
  • If it was an “okay” experience, then you’ll probably not respond to the feedback request. You have more important things to do.
  • If it’s a long survey, then you’ll probably give it a miss. Unless there’s a prize on offer and it could be a “good gamble” – with better odds than buying a lottery ticket at least.

So, we only tend to give our feedback and opinions at the extremes of customer experience and when it’s made easy to do – with tick boxes, not text boxes, making life easier.

The potential issues with ITSM tool peer reviews

I’m going to throw out the “elephant in the room” at the outset. What’s to stop a motivated vendor from contacting those it knows to be “happy customers” and asking them to spend ten minutes completing a peer review for its product on the key ITSM tool peer review sites? Or the peer-review company from providing a service that contacts customers on the vendor’s behalf (BTW, this does happen)? This might sound harmless, but is it (if not everyone is doing it)? Your answer might be “Well, everyone could do it if they wanted to, so what’s your issue?”

I’m glad you asked. Firstly, it’s sample bias – I bet vendors aren’t asking known “unhappy customers” to submit their reviews. It’s a game that can be played and, thus, the vendor/tool reviews are not representative of their customer population as a whole. Secondly, and I might be being a little overdramatic, this gaming of the system is akin to sportspeople feeling compelled to take performance-enhancing drugs because “everyone else is (thought to be) doing it.”

Of course, the creators of the peer-review matrices do what they can to ensure that fake reviews aren’t included, but what can they really do about “gamed” reviews, where people are “encouraged” to submit a review – they’re customers after all? And the more reviews received, the better for the matrix creator. Does a sudden influx of reviews for a certain vendor necessarily mean that some third-party motivation is involved? And what should we make of vendors with smaller market shares having the most reviews? I’ll leave you to ponder this one.

Where is the “trust” in peer reviews?

We inherently trust our peers; it’s a very human thing to do. Many of us also blindly trust what we see “in print” – just look at the ongoing hoo-ha around “fake news.” But beyond this, does Joe Public stop to think about, and possibly challenge, the real pillars of trust for such ITSM tool peer-review matrices? For example:

  • What’s the financial model? How do peer-review sites generate the revenue to pay for their operations (they don’t do it for love, like me writing blogs)? Who is paying what to whom? It might all be totally above board but surely some transparency would help with the credibility of the “research”? I think most of us know that, while Tier 1 analyst firms receive payments from vendors, most revenue is received from subscribers and thus these firms operate in the best interest of their subscribers. But is the same true for peer review sites (and ITSM.tools has the same issue in promoting “transparency”)?
  • Is there sufficient industry knowledge? What do the peer-review sites actually know about the ITSM tool market? How do they apply this to creating the right “algorithms” for differentiating between tools, and across technology categories? Or is the same algorithm used no matter the technology category? If the peer-review provider is purely focused on ITSM tools, and similar, then great. But if it provides matrices across 800 disparate technology areas – yes, I counted the matrices for a particular peer-review provider – then does it have the right number and quality of people to ensure that its ITSM tool analysis is correct? Especially when the same scoring mechanism is used for all 800 matrices.

Diving deeper into an ITSM tool peer review example

As stated earlier, this blog is ultimately the result of a single statement based on a particular ITSM tool peer-review site’s matrix. And thus, this is the one I’ve chosen to look at more closely. It doesn’t mean that my observations reflect all peer-review mechanisms – I’m just trying to prove that people need to take peer-review comparisons with the proverbial “pinch of salt.”

I’d also say that the same is true about any ITSM tool comparison, as what you see is based on assumptions, opinions, and particular use cases; and is not necessarily an exact match to your organization’s ITSM wants and needs.

So, let’s start to dig, with the source of my information and opinions being a particular matrix from late Q1 2017.

The service desk matrix plotted:

  • Satisfaction – this is “based on customer satisfaction data sourced from real users’ reviews.” It doesn’t explain what this really means – you need to go to another page for the detail – but the matrix does show the star-based satisfaction for each tool. For instance, HEAT and LANDESK (now Ivanti) scored 3 and 3.5 respectively and were plotted close to CA Technologies (a 4-starrer) in the left-hand side of the matrix, making the relative position not just the star score but something else. Even before seeking an explanation, my bet was on “number of peer reviews,” given how the scores are spread across the “satisfaction” axis – the more reviews you have, the further a 4-star rating pushes you to the right. Given my earlier comment about gaming responses, is this really a fair plotting of customer satisfaction?
  • Market presence – this is based on “over 10 different social sources that indicate the products’ market share, vendor size, and social impact.” To be honest, I had no idea what this actually meant until I sought out the detail, but it did mean that HP Service Anywhere – a product that had been discontinued at the time (read into this what you will re market presence) – had a larger “market presence” than the LANDESK and HEAT products (whereas other analyst market share analyses have Ivanti in the top five in terms of market presence). And HP Service Manager had the largest market presence of all. I’m sure that even HP’s marketing team wouldn’t make these statements (although they would be in the top 10).

On the face of it, it seems that the more reviews you have, coupled with the user scores, the better your position. And I thus refer you back to the ability for peer reviews to be gamed – getting as many reviews as possible from happy customers.
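To make that suspicion concrete, here’s a minimal sketch of how such a plotting might work. To be clear, the site doesn’t publish its formula, so the blend and the weighting below are entirely my assumptions – the point is simply that once review volume enters the calculation, a 3.5-star product with hundreds of reviews can plot to the right of a 4-star product with a handful.

```python
# Hypothetical illustration only - the peer-review site doesn't publish its formula.
# Assumption: the "satisfaction" axis blends average stars with review volume.
import math

def satisfaction_position(avg_stars: float, review_count: int,
                          volume_weight: float = 0.4) -> float:
    """Blend a 1-5 star average with (log-scaled) review volume into a 0-1 x-position.

    volume_weight is a made-up parameter: 0.0 would mean stars only,
    1.0 would mean review count only.
    """
    stars_component = avg_stars / 5.0                       # normalize stars to 0-1
    volume_component = math.log10(1 + review_count) / 3.0   # ~1.0 at around 1,000 reviews
    return ((1 - volume_weight) * stars_component
            + volume_weight * min(volume_component, 1.0))

# A 4-star product with 12 reviews vs. a 3.5-star product with 400 reviews:
print(satisfaction_position(4.0, 12))   # ~0.63
print(satisfaction_position(3.5, 400))  # ~0.77 - further right despite the lower star score
```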

I’m not saying it definitely happens, but how else would ITSM tools that many of you have never heard of get more reviews than ITSM industry stalwarts on ITSM tool peer review sites? The matrix just didn’t look right to someone who has been following the ITSM tool market for the best part of a decade (and of course the matrix I reviewed will have changed since).

The ITSM tool scoring methodology

The scoring methodology was available on a “transparency”-type page. For customer satisfaction, in “order of importance” it was:

  • Customer satisfaction (based on user reviews) – which is fair
  • Popularity based on the number of user reviews received – which can definitely be gamed
  • Quality of reviews – again gameable, just get the customer to write lots
  • Age of reviews – again gameable
  • Customers’ satisfaction with product attributes – unsure how subjective this is or how it’s scientifically gauged
  • Overall customer satisfaction and Net Promoter Score (NPS)

Of course, I dropped down into the individual user reviews when considering these parameters but have yet to see anything other than the stars, submitted text, and review submission-counts.
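Out of curiosity, here’s how those six factors might combine in practice. The site gives an “order of importance” but not the actual weights, so the descending weights in this sketch are invented purely for illustration – and they show why nudging happy customers into submitting lots of fresh, wordy reviews lifts several factors at once.

```python
# Hypothetical sketch only: the site states an "order of importance" for its
# customer satisfaction scoring but not the weights, so these are invented.

FACTORS = [
    ("customer_satisfaction", 0.30),   # average user rating
    ("popularity", 0.25),              # number of reviews received - gameable
    ("review_quality", 0.18),          # length/detail of reviews - gameable
    ("review_recency", 0.12),          # age of reviews - gameable
    ("attribute_satisfaction", 0.10),  # satisfaction with product attributes
    ("overall_nps", 0.05),             # overall satisfaction / Net Promoter Score
]

def composite_score(normalized_scores: dict) -> float:
    """Weighted sum of factor scores, each already normalized to the 0-1 range."""
    return sum(weight * normalized_scores.get(name, 0.0) for name, weight in FACTORS)

# A vendor that nudges happy customers into submitting many fresh, wordy reviews
# lifts three of the six factors at once, regardless of underlying product quality.
gamed = composite_score({
    "customer_satisfaction": 0.78, "popularity": 0.95, "review_quality": 0.90,
    "review_recency": 0.95, "attribute_satisfaction": 0.75, "overall_nps": 0.70,
})
quiet = composite_score({
    "customer_satisfaction": 0.85, "popularity": 0.30, "review_quality": 0.50,
    "review_recency": 0.40, "attribute_satisfaction": 0.80, "overall_nps": 0.80,
})
print(round(gamed, 2), round(quiet, 2))  # 0.86 0.59 - the "gamed" vendor wins
```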

As for the market presence metric, this was:

  • Product market presence – employee numbers, number of product reviews (again), and product social impact based on Twitter followers and “domain authority”
  • Vendor market presence – the number of company employees (based on social networks and public resources), vendor “momentum” based on web traffic and search trends, vendor social impact based on social measures, age of company (not the product?), and employee satisfaction and engagement (based on social network ratings).

My question to you is: “Is this really market presence?” At best, it’s social media presence and the building of a “house of cards” using third-party social media analysis, which is itself somewhat shaky. Market presence should be based on product revenues, customers, or seats/subscriptions; and by all means the movement, up or down, over time can be represented.
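For completeness, here’s the same kind of sketch for the two sides of that argument – a social-signals composite versus the revenue-and-seats view I’d prefer. Again, every weight and number below is my own invention, not the site’s methodology; the point is that the two measures can rank the same two vendors in opposite orders.

```python
# Another hypothetical sketch with invented weights and numbers. It contrasts a
# social-signals "presence" composite with the revenue/seats-based measure that
# (in my view) market presence should actually reflect.

def social_presence(twitter_followers: int, web_traffic_score: float,
                    employees: int, review_count: int) -> float:
    """Made-up blend of the kinds of social signals the methodology lists (0-1)."""
    return (0.3 * min(twitter_followers / 100_000, 1.0)
            + 0.3 * web_traffic_score                 # assumed already normalized 0-1
            + 0.2 * min(employees / 10_000, 1.0)
            + 0.2 * min(review_count / 200, 1.0))

def market_presence(annual_revenue_musd: float, seats: int) -> float:
    """Revenue- and seat-based alternative (again, arbitrary normalization)."""
    return (0.5 * min(annual_revenue_musd / 500, 1.0)
            + 0.5 * min(seats / 1_000_000, 1.0))

# Vendor A: big brand with a heavy social footprint, but the product is discontinued.
# Vendor B: mid-size vendor with a large, active installed base.
print("A social:", social_presence(80_000, 0.9, 10_000, 150))  # ~0.86
print("B social:", social_presence(15_000, 0.4, 1_800, 60))    # ~0.26
print("A market:", market_presence(40, 150_000))               # ~0.12
print("B market:", market_presence(120, 900_000))              # ~0.57
```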

To bring this overly long blog to a close, I want to make and underline my point – the reason why I wrote the previous 1,800 words. I’m all for ITSM tool peer reviews, but don’t be misled by the use of the individual reviews to create something that’s probably not a fair reflection of the truth. My point is aimed at both enterprises wishing to invest in a new tool and ITSM tool vendors wishing to “big up” themselves. If you truly understood it, you wouldn’t share it. If you don’t understand it, then you shouldn’t share it.

That’s the sharing of what is most likely misinformation and, however well intended, it isn’t good for anyone (except maybe its creators).

If you want more content like this, please subscribe to our email newsletter. Or follow ITSM.tools or Stephen Mann on Twitter.


Stephen Mann

Principal Analyst and Content Director at the ITSM-focused industry analyst firm ITSM.tools. Also an independent IT and IT service management marketing content creator, and a frequent blogger, writer, and presenter on the challenges and opportunities for IT service management professionals.

Previously held positions in IT research and analysis (at IT industry analyst firms Ovum and Forrester and the UK Post Office), IT service management consultancy, enterprise IT service desk and IT service management, IT asset management, innovation and creativity facilitation, project management, finance consultancy, internal audit, and product marketing for a SaaS IT service management technology vendor.




2 Responses

  1. Your article is interesting, even though you tread very lightly. I think I might have something to do with why you wrote this article. (If it is my tweet you are referring to in the text, I feel you misrepresent the chain of events.) As you know we don’t share the exact same view on this topic. I find that peer reviews are useful – also for ITSM tool selection. So I’ll weigh in as well.

    We certainly agree on one thing. It doesn’t really matter where the tool is positioned in the peer review matrix. But if you take the time to read the reviews, you will find useful input.

    As you also point out, Stephen, you can discover new vendors and tools in the peer reviews, which may not be picked up on by the major analysts yet.

    Here are my thoughts:
    https://www.linkedin.com/pulse/dont-select-your-itsm-tool-without-considering-crowd-reviews-henrik
