Call monitoring forms and call quality scorecards are the go-to tools in phone-based businesses for measuring employee performance, discovering compliance issues, and gaining a deeper understanding of customers.
When used correctly, call evaluations bring transparency into the conversations that your reps are having with customers. They also promote high-quality interactions with your brand by setting clear goals for the outcome of each call and create accountability for employees.
Used incorrectly, they do the opposite: agents get confused and frustrated when expectations are muddy, goals are unreachable, and information is limited.
In this post I’ll cover:
- Common mistakes to avoid when developing a scorecard for your contact center
- How to outline a kick-ass QA form
- How to distill your outline to individual monitoring parameters
- A free template for building your call monitoring forms
6 Mistakes to Avoid on your Call Monitoring Forms
Here are some of the common mistakes we run across when our customers set up call scorecards. In other words: don’t do these things.
1. One-size-fits-all scoring
Different call types should be measured differently.
Sales is not the same as support. So of course they shouldn’t be measured the same.
This goes for all of your different business functions that communicate with customers, and if you’re an outsourced contact center or a BPO this applies to different projects and brands, too.
2. Too many metrics
This is just confusing for everyone involved.
Pick the most important metrics and try to keep the total under 15.
Can you imagine being a front line rep and trying to hit 30 key points each call?
You’d just be adding more stress to one of the most inherently stressful jobs around.
3. Too rigid
Don’t restrict the natural flow of conversation by requiring that every metric be hit at a specific point in the call.
Some of this is fine. Reps should certainly disclose that the call is being recorded in the first few moments of the conversation.
Requiring a rep to say the phrase “I can certainly help with that” Every Single Time a customer asks a question, though, is just ridiculous and sounds unnatural to customers.
4. Ambiguous metrics
You can’t get reliable measurements or expect consistent behavior from employees if everyone interprets a metric differently.
If you’re dead set on using a metric like “How awesome was this call?”, you should at least have a call recording library of awesome calls and less-awesome calls that all of your employees can access.
That way everyone else can hear first-hand what you think is so awesome.
5. Impossible metrics
There are some things that your contact center agents just don’t have any control over.
Please don’t hold them accountable for a customer’s happiness.
Sometimes a customer is just going to be pissed off and stay that way. No matter what your rep does.
A much better metric would be a customer effort score or a customer satisfaction score. It’s better to measure how well the agent did her job instead of how happy the customer was.
Because a customer who’s having a bad day can still be satisfied with the help they were provided.
6. Relying on manual scoring alone
Human intuition and insight are incredibly important but also very expensive to scale.
Most companies try to use manual call scoring as their first and last efforts to manage quality and inform coaching.
Unfortunately, those efforts are usually wasted on calls that don’t matter.
The most effective solution is call monitoring software: one that transcribes 100% of calls, scores them based on keywords and phrases, enriches the data with customer feedback and key metrics, and streamlines manual evaluations to target the highest-value conversations. (Yeah, this is a shameless plug for Voxjar.)
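To make the idea concrete, keyword-and-phrase scoring can be sketched in a few lines. This is a minimal illustration, not Voxjar’s actual implementation; the parameter names, phrases, and point values here are made-up assumptions.

```python
# Minimal sketch of keyword/phrase scoring for call transcripts.
# Parameter names, phrases, and point values are illustrative
# assumptions, not Voxjar's actual scoring rules.

def score_transcript(transcript, parameters):
    """Score a transcript against keyword-based parameters.

    parameters maps a parameter name to (phrases, points); a parameter
    earns its points if any of its phrases appears in the transcript.
    """
    text = transcript.lower()
    return {
        name: points if any(p.lower() in text for p in phrases) else 0
        for name, (phrases, points) in parameters.items()
    }

params = {
    "recording disclosure": (["this call is being recorded"], 10),
    "greeting": (["thanks for calling", "how can i help"], 5),
}

call = "Hi, thanks for calling Acme. This call is being recorded."
print(score_transcript(call, params))
# {'recording disclosure': 10, 'greeting': 5}
```

Real systems layer fuzzy matching and speech-to-text confidence on top of this, but the core idea is the same: each parameter is a concrete, checkable behavior.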
There are obviously more mistakes that we could list, but at Voxjar we like to focus on the positives.
So from here on out, I’ll cover best practices for crafting awesome QA forms for call monitoring.
Outline Your Scorecards
The first thing you should do when developing a call monitoring form is spend time thinking about what your goals are. Do this on a high level before getting buried in the details of individual metrics.
- Do you want to increase the quality of phone calls? If so, what does that even mean to your organization?
- Is reducing friction in your customers’ journey top priority?
- Are you concerned with compliance?
A little brainstorming here will go a long way when developing monitoring forms that deliver actionable data from your reps’ phone calls.
My favorite way to do this is to grab a whiteboard and get every idea I can come up with written down. Don’t worry about being super organized yet. The most important thing is to get your wheels turning.
If you have multiple call types (e.g. sales, support, multiple projects) then you’ll want to do this for each one. It’ll get easier with each iteration.
Do this as a team. Collaboration will get you further, faster, and involving your front-line reps will surface unique and diverse perspectives.
Once you’ve filled your whiteboard with ideas, you’ll probably start to see some patterns that make it easy to group similar ideas and rank them by importance.
Your list might look something like this:
- Why did they call?
- How did they hear about us?
- Did the agent achieve first call resolution?
- What are customers’ most common objections and how do reps overcome them?
Behaviors to promote in reps:
- Script adherence
- Verifying customer information
- Legal compliance
- Tone of voice
- Asking good questions
- Product knowledge
- Cross-selling and upselling
- Following the right sales process
- Saving notes post-call
Behaviors that need to be caught and dealt with right away:
- Legal threats
- Dodging calls
It’s important to trim this list down if you want your reps to get on board. Too many parameters and they’ll start to feel helpless and lose motivation.
Call Monitoring Parameters
Now that you know what you’re trying to achieve with call evaluations and quality assurance, it’s time to break the bigger ideas down into individual behaviors that you can measure.
In the quality monitoring form template below, we’ll include a list of common parameters to help spark some ideas.
I suggest picking and choosing from that list, where you can, to help fine tune the behaviors you came up with in your brainstorming session.
You’ll also probably find that you naturally brainstormed some metrics that are already easy to measure.
Make sure that the parameters are clearly understood by everyone involved. Especially front-line employees.
If a parameter is ambiguous or too subjective you’ll run into reliability problems with your scores. Take special care to clearly define metrics like tone of voice, rapport, empathy, etc.
A sign that a metric might be too subjective or is ambiguous is a wide variability in scores.
- A rep is scored 3/10 on “tone of voice” because she wasn’t upbeat enough on the call according to her manager.
- She gets 6/10 from her direct supervisor because she kept a calm demeanor throughout the call but could have been more enthusiastic.
- The customer fills out a satisfaction survey and rates the rep 10/10 and says that she was “very friendly”.
Subjective scores like this happen a lot in one form or another.
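One quick way to spot an ambiguous metric is to compare scores from different evaluators for the same call and flag metrics with a wide spread. A rough sketch, assuming scores on a 0–10 scale (the threshold and sample data are made up):

```python
# Flag metrics whose scores vary widely between evaluators for the
# same call - a sign the definition is ambiguous. The threshold and
# sample data are illustrative assumptions.
from statistics import pstdev

def ambiguous_metrics(scores_by_metric, threshold=2.0):
    """Return metric names whose score spread across evaluators
    exceeds the threshold (scores assumed on a 0-10 scale)."""
    return [
        metric for metric, scores in scores_by_metric.items()
        if pstdev(scores) > threshold
    ]

scores = {
    "tone of voice": [3, 6, 10],    # manager, supervisor, customer
    "script adherence": [8, 9, 8],  # evaluators largely agree
}
print(ambiguous_metrics(scores))
# ['tone of voice']
```

If a metric keeps landing on that list, it needs a tighter definition (or should be split into separate metrics), not more scoring.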
You might be wondering how to prevent this.
This might feel counter to what you’ve read so far but stay with me.
Once you’ve done everything you can to make your metrics concrete, and you’ve defined exactly why they matter and how they contribute to the overall goals of the company…
…There will still be some subjectivity.
Subjectivity is inevitable when people are involved, and it’s not necessarily a bad thing. You want managers and supervisors to draw on their personal experiences to judge performance. That’s why they’re there.
Some things certainly should be measured objectively. Like compliance and script adherence.
These two metric types aren’t mutually exclusive, though.
Combining them is where the magic happens.
The problems show up when a metric is misunderstood, which brings us to my next point: subjectivity is usually only a problem when the parameter isn’t well defined.
Here’s the key:
A subjective score can be ok. An ambiguous definition or understanding of the metric is not ok.
An example of this would be referees. Every professional ref has a crystal clear understanding of the rules of the game.
But two refs watching the same game can, on occasion, make different calls. Their perspectives may differ slightly even though their understanding of the rules of the game is the same.
For a game to go smoothly, the refs, the coaches, and the players all need a clear and uniform understanding of the rules of the game they’re playing.
Similarly, if your managers, quality assurance team, supervisors, and agents all have different opinions on what the metric “awesomeness” means, then you have a problem.
Often, the solution is to clearly define what it means to have an “awesome” call and coach accordingly.
If being awesome includes showing empathy, asking clarifying questions, and building rapport then you should spell it out to each employee or, better yet, break it into separate metrics.
Rely on customer feedback.
In our earlier scenario, the agent received customer feedback.
If you’re collecting customer satisfaction or net promoter score data, it should absolutely be included when evaluating calls.
You might not have this data for every call but when you do it should be heavily weighted.
As a final point, keep testing.
Your forms should be living documents. If the metrics on your scorecards aren’t correlating with business goals, then you should rework the forms.
Free Call Monitoring Form Template
I promised you a template and you better believe I’m gonna deliver. Here’s what I have for you.
- A free call monitoring template via Google Sheets.
- All the benefits of Google Drive (share, collaborate, easy access, etc.)
- Pre-built outline of a scorecard that is easy to expand
- Code that will save scorecard results and clear the QA form at the click of a button
- Instructions on how to expand the sheet and maintain the code functionality
- A PDF with 101 call scorecard parameters
All because I like you.
If you’re short on resources, this template will get your call monitoring and quality assurance efforts off the ground ASAP.
Click the link at the bottom of the page and we’ll email you access to the sheet.
Download the Call Quality Scorecard Template
Click the image below to get your Google Sheets call monitoring form template and a pdf of 101 call quality monitoring parameters.
P.S. Keep Voxjar in mind when you’re ready to upgrade. We built it just for you.
- Automatically analyze and score 100% of calls
- Build custom call monitoring forms for manual reviews
- Easy access to every call recording for monitoring and coaching
- Custom analytics dashboards for all the important stuff
- Annotate and share call recordings