I led an IT services company for twenty-five years, and for most of that time we aimed to meet our response-time and resolution-time targets on 95% of our tickets. I'll refer to these targets collectively as SLA (Service Level Agreement) for the rest of this article.
When we implemented EOS in 2013, that goal made it onto our scorecard and stayed there for a very long time. That was a mistake. I'll identify three issues with this scorecard and share three solves. As you read, think about your own scorecard; I'm confident you can use my mistake to find opportunities to improve it.

Table of Contents
- Issue #1: The intended outcome of this scorecard isn't clear
- Issue #2: The scorecard should drive future action
- Issue #3: Percentages are suboptimal
- Solve #1: Be clear on why your Measurable is on the Scorecard
- Solve #2: Drive future action
- Solve #3: Avoid percentages or ratios when possible
- Putting it all together
- Your turn
Issue #1: The intended outcome of this scorecard isn't clear
The first issue with the SLA scorecard: we weren't quite sure why it was there. The 95% target had been in place for so long that each leadership team member would likely have a unique perspective on why. One person might say, "it tells us if we provided good service last week." Another might say, "we can tell whether or not our team is managing tickets well." A third might assume, "we use that number to know if our clients are happy with us." No one is wrong. No one is right. We were measuring everything and nothing all at once, and that's not good enough.
When you aren't clear on why a scorecard number exists, the first step is simply to ask your team: "What's the outcome we're hoping to impact when we do a great job on this scorecard number?"
Issue #2: The scorecard should drive future action
Another issue with our SLA scorecard: the metric was rooted firmly in the past. When last week's result was below target, we spent our time as a team discussing why last week happened. Quite often, we found excuses in last week's unique circumstances (there were so many tickets! or three people were sick! or… the list goes on). Often, we'd end up simply agreeing we needed to do better next week, without much specificity.
We could and should have done a better job IDS'ing the issue, but when the scorecard number isn't helping you orient toward the future, it makes your IDS that much harder.
Issue #3: Percentages are suboptimal
One additional issue with our SLA Goals scorecard: the number was not simple. The team had to understand the total number of tickets (the denominator), the number of tickets responded to or resolved on time (the numerator), and the relationship between the two (the percentage). Inevitably, the first questions turned to how and what we were measuring: when was the ticket created? Are we only counting tickets created and closed in the week? Should we count really old tickets? Too easily, we could dismiss a low SLA Goals result by rationalizing the measurement itself.
When your team needs precious IDS time to understand, clarify, rationalize, or even argue over the equation behind a Scorecard metric, it's a good sign that metric is due for a tune-up.
Solve #1: Be clear on why your Measurable is on the Scorecard
In this example, the answer was simply "Are clients happy?" We had numerous ways to understand client satisfaction: ticket surveys, Net Promoter Surveys, and so on. But ticket performance, responding to and resolving tickets, was the main driver of client happiness and, ultimately, client retention (and that's why SLA is relevant to the "are clients happy?" question). What made clients really unhappy, and more likely to cancel services, was when something significant happened (we called those Priority 1 / P1 tickets) and our support team didn't do a good job responding to or resolving the problem.

With that clarity, we added a scorecard metric that specifically addressed SLA Goals on P1 tickets. Not a percentage, but simply "How many P1 tickets missed their SLA Goals last week?" With that simple metric, we knew which clients needed extra attention the following week to get the relationship back on track.
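To make the contrast concrete, here's a minimal sketch contrasting the old percentage metric with the new count. The ticket records and field names (`priority`, `met_sla`) are hypothetical, illustrative stand-ins, not the author's actual tooling:

```python
# Hypothetical last-week ticket records; field names are illustrative only.
tickets = [
    {"id": 101, "priority": "P1", "met_sla": True},
    {"id": 102, "priority": "P3", "met_sla": True},
    {"id": 103, "priority": "P1", "met_sla": False},
    {"id": 104, "priority": "P2", "met_sla": True},
]

# The old metric: a percentage needing a numerator, a denominator,
# and an agreed definition of which tickets count.
sla_pct = 100 * sum(t["met_sla"] for t in tickets) / len(tickets)

# The new metric: which P1 tickets missed their SLA Goals last week?
p1_misses = [t["id"] for t in tickets
             if t["priority"] == "P1" and not t["met_sla"]]

print(f"SLA: {sla_pct:.0f}%")      # prints "SLA: 75%"
print(f"P1 misses: {p1_misses}")   # prints "P1 misses: [103]"
```

Note the difference in actionability: the percentage invites debate about the denominator, while the P1 list points straight at the clients who need attention next week.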
For each item on your scorecard, your team should know why it’s there with specificity. When a number on your scorecard doesn’t have a clear answer (or has lots of answers!) to the question “Why is this number on our scorecard?”, your team must get on the same page. If this is the root of your scorecard issue, have a quick discussion, make a decision and get aligned – then update your scorecard!
Solve #2: Drive future action
When we thought about what made a good week, one key was the simple idea of managing tickets well. We knew we were managing tickets well, with good systems run by people who care, so we dug a bit deeper and realized the #1 obstacle to managing tickets well was being staffed appropriately. Instead of measuring last week's SLA Goals, we adjusted the scorecard to look at the upcoming week's schedule.

The scorecard metric we settled on was “How many uncovered shifts on the help desk do we have next week?” — anything more than 0 got sent to the issues list to solve. We ensured the scorecard metric drove future action with clarity.
The scorecard should give your team absolute clarity on what’s going well and not going well within the organization every week. With that clarity, your team can use the scorecard to make next week better.
Solve #3: Avoid percentages or ratios when possible
We still needed to measure our performance, and SLA remained a good measure of quality work. Our clients consistently told us they loved the service they received, but that didn't line up with our SLA results. When we dug into this scorecard a bit more, we found this wasn't a "global" SLA issue; it was limited to specific teams and team members.

We kept using SLA data as a source, but the scorecard metric changed to "How many teams didn't hit SLA?" With more specificity, we could address each team directly and dive deeply into the issues it faced, ensuring it made the right adjustments to achieve SLA the following week.
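As a sketch under the same assumptions (hypothetical per-ticket results tagged with a team name, and the article's 95% target), counting the teams that missed SLA is a simple group-and-filter:

```python
from collections import defaultdict

# Hypothetical (team, met_sla) results for last week -- illustrative only.
results = [
    ("helpdesk", True), ("helpdesk", True), ("helpdesk", False),
    ("network", True), ("network", True),
    ("onsite", False), ("onsite", True),
]

TARGET = 0.95  # the 95% SLA target from the article

# Group per-ticket results by team, then flag any team below target.
by_team = defaultdict(list)
for team, met in results:
    by_team[team].append(met)

missed = [team for team, hits in by_team.items()
          if sum(hits) / len(hits) < TARGET]

print(f"Teams that missed SLA: {len(missed)}")  # prints "Teams that missed SLA: 2"
```

The scorecard number is just `len(missed)`, but the list behind it names the specific teams to sit down with, which is exactly the specificity the metric change was after.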
The scorecard should give your team the right insight to specifically understand what’s happening in the business. Avoid a metric requiring you to spend time on anything other than solving the issue.
Putting it all together
We ultimately replaced two difficult-to-solve measurables with three easily actionable numbers. Just look at the difference:

Not only does this scorecard solve the issues that plagued the earlier version, it also tells a clear story. Understaffed weeks can be predictors of SLA issues, both for the support teams and for clients; now the team can act with clarity and confidence to resolve staffing issues. With more specificity on SLA problems, the team can quickly uncover client satisfaction issues and work with underperforming teams.
Your turn
Which measurables on your scorecard are due for a tune-up? Simple changes can make a huge impact. If you feel stuck, I'm happy to help: use this link to schedule a 30-minute Zoom. I also host a weekly EOS-focused podcast called All 10's, am active on LinkedIn, and would love to connect with you!

Clay Harris is a professional EOS Implementer with a long entrepreneurial background. He started an IT services business his senior year of college and scraped and clawed his way to $5M in revenue and 30 team members before hitting the ceiling.
After his team found and implemented EOS®, they were off to the races, growing that business to $20M+ revenue and 100+ team members with their newfound Traction®.
Over that time, Clay fell out of love with IT services and in love with coaching leaders, so he took the leap to become an EOS Implementer. Learn more about Clay and how he brings his deep entrepreneurial expertise to his clients on his website.