
Why I Hate Help Desk Metrics

Photo: birgerking

Tech support, as many of you know, can be a grueling job.  The problems vary enormously, from frozen screens to document formatting issues to malware infestations to video display madness.  There are days when you are swamped with tickets.  And the customers span the full scale from tech-averse to think-they-know-it-all. I’ve done tech support and I’ve managed tech support for most of my career, and providing good support isn’t the biggest challenge.  Rather, it’s keeping the tech support staff from going over the edge.

In our nptech circles, it would be natural to assume that having good metrics on everything help desk would assist me in solving these problems. Good metrics might inform me regarding the proper staffing levels, the types of expertise needed, the gaps in our application suites, all that good stuff that can support my budgeting and strategy. But once I start collecting them, I open myself up to the imminent threat that someone else in management (my boss, the board, or whomever) might want to see the metrics, too. What they want to see are metrics like:

  • Average tickets and calls per day
  • Number of open tickets
  • Average time to resolve a ticket

Their idea is that these numbers will tell them how productive the tech support staff are, how efficient, and how successful they are at resolving problems.
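To make it concrete, here’s a minimal sketch of how these three numbers are typically derived from a ticketing system’s export. The file name, column names, and date format are hypothetical, and most help desk tools will calculate these for you, but spelling out the arithmetic shows just how little each number actually captures:

```python
import csv
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical export: one row per ticket, with columns
# ticket_id, opened_at, closed_at (closed_at is blank while open).
FMT = "%Y-%m-%d %H:%M"

tickets = []
with open("tickets.csv", newline="") as f:
    for row in csv.DictReader(f):
        opened = datetime.strptime(row["opened_at"], FMT)
        closed = datetime.strptime(row["closed_at"], FMT) if row["closed_at"] else None
        tickets.append((opened, closed))

# Average tickets per day: total tickets divided by days with activity.
per_day = defaultdict(int)
for opened, _ in tickets:
    per_day[opened.date()] += 1
avg_per_day = sum(per_day.values()) / len(per_day)

# Number of open tickets: anything without a closed timestamp.
open_count = sum(1 for _, closed in tickets if closed is None)

# Average time to resolve: mean open-to-close interval for closed tickets.
durations = [closed - opened for opened, closed in tickets if closed is not None]
avg_resolution = sum(durations, timedelta()) / len(durations)

print(f"Average tickets/day: {avg_per_day:.1f}")
print(f"Open tickets:        {open_count}")
print(f"Average resolution:  {avg_resolution}")
```

Each calculation collapses wildly different tickets into a single number, which is exactly where the trouble starts.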

Every one of these is an unreliable metric.  Alone or together, they don’t tell a meaningful story. Let’s take them one by one:

Average daily tickets: This is a number that is allegedly meaningful as it rises and falls.  If we have 30 tickets a day in January, and 50 a day in February, it means something.  But what?  Does it mean that IT is being more productive?  Does it imply that there are more issues popping up? Is it because more people are feeling comfortable about calling the help desk?  If, instead, we drop to 15 in February, what does that mean?  That IT has stabilized a lot of problems, or that the users have figured out that others in the org are more helpful than IT?

Number of open tickets: The standard assumption is that fewer is better.  And while that is generally true, it can be deceiving, because the nature of tickets varies dramatically.  Some require budget approvals and other time-consuming delays.  An assumption that tickets are open because the technician hasn’t gotten around to resolving them is often wrong.

Average time to resolve a ticket:  This one is deadly, because it is commonly used as a performance metric, based on the assumption that the quicker all tickets are closed, the better the service IT is providing.  The common scenario I’ve encountered where this metric is shared with management is that the tech support staff grow so pressured to close tickets that they regularly close them before the issue is truly resolved.  That creates tension with staff, because the real power of a help desk ticketing system is in the communication that it enables; pressure to close tickets fast cuts that communication off and pushes staff away from a communicative approach to issue resolution.

Worse, it takes away the technician’s ability to prioritize.  Every ticket must be closed quickly in order to look efficient, so every ticket is a priority.  But, in fact, many tickets aren’t high priority at all.  People often want to report computer problems that they aren’t in a hurry to get resolved.  When every ticket is treated like a fire to be put out, staff, naturally, start getting resistant to shouting “fire”, and stop reporting that annoying pop-up error that they get every time they log in.  They start living with all of the little things that they have inconvenient but bearable workarounds for, and as these pile up, they grow more and more annoyed with their computers — and tech support.

So what might useful metrics for assessing the effectiveness of tech support look like? Here’s what I look for:

  • Evidence that the techs are prioritizing tickets correctly. They’re jumping when work stoppage issues are reported and taking their time on very low priority matters.
  • Tickets in the system are well-documented. We’re capturing complex solutions and noting issues that could be reduced with training, fine-tuning or a software upgrade.
  • Shirts are tucked in, hair isn’t mussed, nobody is on the verge of tears. High stress on support techs is usually plain to see.

The type of person who gravitates to a tech support job is a person who likes to help. There are egos involved, and an accompanying love of solving puzzles, but the job satisfaction comes from solving problems, and that’s exactly what we want our support staff to do. Creating an environment where the pressure to close tickets is higher than the pressure to resolve them is a lose-lose scenario for everyone.

Working With Proposal Requests Collaboratively

Okay, I know that it’s a problem worthy of psychoanalysis that I’m so fascinated with the Request for Proposal (RFP) process. But, hey, I do a lot of them. And they do say to write about what you know.

The presentation that I gave at NTEN’s conference in March focused on the process of developing and managing RFPs. I made the case that you want to approach a vendor RFP very differently than you would a software/system RFP. I pushed for fewer fixed-bid proposals, because, in many cases, asking for a fixed bid is simply asking for a promise that will be hard to keep. ROI involves far more than just the dollars spent on projects like CRM deployments and web site revamps.

In the session, I learned that Requests for Information (RFIs), which are simpler for vendors to respond to, can be a great tool for narrowing a field.  It’s important that clients respect the fact that vendors don’t get paid to respond to proposals; they only get paid if they win the bid. Showing respect on both sides, at the very glimmer of an engagement, is a key step in developing a healthy relationship.

Since the conference, I’ve gotten a bit more creative about the software that we use to manage the RFP process, and I wanted to give a shout-out to the tools that have made it all easier.  There are alternatives, of course, and I still use the Microsoft apps that these have replaced on a daily basis for other work that they’re great at. But the key here is that these apps live in the cloud and support collaboration in ways that make a tedious process much easier.

Google Docs is replacing Microsoft Word as my RFP platform. The advantages over Word are that I can:

  • Share the document with whomever I choose, whether the whole world or a select set of invitees. Google’s sharing permissions are very flexible. With Word, I had to email or upload a document; with Google Docs, I only have to share a link.
  • Share it as a read-only document that invitees can comment on. This simplifies the Q&A portion of the process while maintaining the important transparency, as all participants can see every question and response.

We recently did an RFI for web development (it’s closed now, sorry!), and here’s exactly what it looked like.

Smartsheet is replacing Microsoft Excel as my response matrix platform.

Example of a Smartsheet matrix

The first step upon receiving responses to a request is always to put them all in a spreadsheet for easy comparison.  Smartsheet beats Excel because it’s multi-user and collaborative. And since Smartsheet is a spreadsheet/form builder/project management mashup app, I can add checkboxes and multiple-choice fields to my matrix.

For simple proposals, you can also easily use Smartsheet to collect reviewer comments and votes: just add a few columns (two for each reviewer). This puts the matrix and evaluation criteria all in one place, which can easily be exported to a spreadsheet or PDF in order to document the decision.

SurveyMonkey has replaced Excel for cases when the evaluation criteria are more complex than a yes/no vote. Using its simple but sophisticated questionnaire builder, you can ask a number of questions with weighted or scaled answers. The responses can be automatically tallied and, as with Smartsheet, exported to Excel for further analysis or published as charts to a PDF.
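To give a sense of what “weighted or scaled answers” means in practice, here’s a minimal sketch of the kind of tally that a questionnaire tool automates. The criteria, weights, and reviewer scores below are invented for illustration:

```python
# Hypothetical evaluation: each reviewer rates each vendor 1-5 per
# criterion, and each criterion carries a weight (weights sum to 1.0).
weights = {"fit": 0.4, "experience": 0.3, "cost": 0.3}

reviewer_scores = {
    "Vendor A": {"fit": [4, 5], "experience": [3, 4], "cost": [2, 3]},
    "Vendor B": {"fit": [3, 3], "experience": [5, 4], "cost": [4, 5]},
}

for vendor, ratings in reviewer_scores.items():
    total = sum(
        weights[criterion] * (sum(scores) / len(scores))  # weight x average rating
        for criterion, scores in ratings.items()
    )
    print(f"{vendor}: {total:.2f} out of 5")
```

The point isn’t the arithmetic, which is trivial; it’s that the weighting and tallying happen automatically, so reviewers only have to answer the questions.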

Consultant Selection Chart

As I’ve ranted elsewhere, making a good investment in software and vendor evaluation has a big impact on how successful a project will be.  Working with the staff who are impacted by the project to choose the partner or technology increases buy-in and the validity of the initiative in the eyes of the people who will make or break it. And a healthy process ensures that you are purchasing the right software or hiring the right people.  These tools help me make that process easier and more transparent.  What works for you?