A resource for customer experience (CX) and experience management (XM) professionals.
Subscribe on Apple Podcasts Listen on Spotify Listen on Stitcher Listen on YouTube

The CX Conundrum of Benchmarking

Release Date: March 26, 2024 • Episode #309

What are some of the more common challenges you encounter with your CX program? Have you noticed a particular common problem among CX programs when you converse with fellow CX pros? Benchmarking is a challenge customer experience professionals often tackle, and it is not as simple as comparing your numbers with another company’s scores. This episode is the inaugural show of a new series we’re calling “CX Conundrums,” where we’ll address common CX challenges and possible ways CX leaders can address them.

Host Troy Powell dives into this conundrum with two seasoned CX professionals who have a great deal of experience in benchmarking: Sean Clayton, who has 20-plus years of experience working with a wide range of companies, and James Bampos, whose background includes CX at various tech firms as well as leadership on a recent benchmarking initiative with the XM Institute.

James Bampos

Connect with James

Sean Clayton

Connect with Sean


It’s not an apples-to-apples comparison

Sean: “The C-suite is very used to being able to compare themselves, particularly if they’re publicly traded, to other companies on financial metrics, and they have a natural expectation that it’s going to be the same when they get introduced to a customer experience metric. I mean, they’re out on the golf course comparing their golf swings to other CEOs. So I think it’s a natural thing. And the challenge is that there are all kinds of pitfalls with benchmarking in a customer experience setting that make it incredibly difficult to get a true apples-to-apples comparison.”

Beware the pitfall of comparisons

James: “…one of the pitfalls I’ve seen also, and I think you both mentioned it, is this obsession with benchmarking at an executive level. They have to compare themselves somehow, and when CX is introduced, there’s often a disregard for the validity of what they’re looking at. And there are a lot of conclusions that come right away, and they’re not accurate. So I think, again, setting up what you are looking at and what it means is a really important part of this exercise, because otherwise you’re going to be caught in that never-ending cycle of having to produce benchmarks that may not represent your business or the performance you’re trying to obtain.”


You gather a ton of data about your customer experience, and it provides great insights into how your program is performing over time. But how do you know if you’re in the ballpark compared to other organizations?
And especially as CX leaders, we live in this world, so we’re thinking about benchmarking a lot. And when we present, we often don’t step back for a moment and ask: what are we doing? What does this mean? Why are we here, and what does success look like?
Let’s look at the CX conundrum of benchmarking on this episode of The CX Leader Podcast.
The CX Leader Podcast is produced by Walker, an experience management firm that helps our clients accelerate their XM success. You can find out more at walkerinfo.com.
Hello everyone. I’m Troy Powell, host of this episode of The CX Leader Podcast, and thank you for listening. It’s never been a better time to be a CX leader, and we explore topics and themes to help leaders like you develop great programs and deliver amazing experiences for your customers. What are some of the more common challenges you encounter with your CX program? Have you noticed a particular problem among CX programs when you talk with fellow pros? Well, we’re going to tackle some of those issues in a new series we’re calling CX Conundrums, where we’ll address common CX challenges and possible ways CX leaders can address them. For this inaugural episode, we’re going to tackle a common conundrum that comes up often with our clients: benchmarking. To delve into this topic, we’re joined by two seasoned CX professionals who have a great deal of experience in benchmarking and everything related to CX measurement and management. First is my colleague here at Walker, Sean Clayton. Sean has 20-plus years of experience working with a wide range of companies on CX topics, and has some experience working on the client side with these topics as well. My other guest is longtime friend James Bampos, whose background goes way back in leading CX initiatives at some top tech firms, and who has now switched over to more of a consulting role. So he’s taking all of that experience, and, as he puts it, the scars from those experiences as well, and bringing them to people who are leading their own programs, providing the advice and consulting that we all need to move our programs forward. So, Sean, James, welcome to The CX Leader Podcast.
Thanks Troy.
Good to be here.
Excellent. So there’s one piece we definitely want to get into in a lot of detail, and I want to call out here, James, that you worked recently on a B2B benchmarking report for Qualtrics that we’ve used pretty extensively with some clients, and it has an interesting perspective. So I want you to be pulling some of that out as we move forward. But where I want to start us is that all-too-familiar place that you both have been in directly, and that all of us have been involved in secondarily while helping clients: you’re presenting some CX results to key stakeholders in your organization, and they see a score. The question is, well, is that good? Is that a good score? Should we be happy with that? How does it compare? Those questions inevitably come up, and then the CX leader is in this position of, all right, well, how do I answer that? Traditionally, what I’ve seen is people go directly to, oh, I need a benchmark to compare that score to, to say, yeah, we’re good, or we’re bad, or we need help. And that’s definitely one answer. But I want us to be thinking more holistically about that. So, James, let’s start with you. That question, that situation: what are some ways you like to think about answering it?
Yeah. I mean, I think we have to start with what we mean when we talk about benchmarking, right? I think there’s a foundation that we often don’t set. And especially as CX leaders, we live in this world, right? So we’re thinking about benchmarking a lot. And when we present, we often don’t step back for a moment and ask: what are we doing? What does this mean? Why are we here, and what does success look like? So often when we start with benchmarking, we’re talking about performance or process benchmarking, right? Two different components: performance in terms of NPS and CSAT; process in terms of, for example, what’s your average response rate to an email survey. And then we look at how we use it internally and externally. When we present benchmarks, we should all understand what we’re presenting and why: our performance in this area of the business should reflect our performance against whom? So again, it’s setting that stage of what we’re presenting and why. And then finally, what are the expectations? What do you do with this? So what? Right. So what?
I completely agree with the comments that James made, and it is a question that any CX professional should expect to come up in a meeting or a presentation. And I can step back and ask why. Why should you expect it to come up? I think the reason is that this is somewhat of an innate trait of humans: we want to be able to compare ourselves to each other at the individual level. And why is that? Well, you mentioned it earlier: we want to know, are we doing okay? So there’s something like a self-definition idea around benchmarking that is going to be prevalent in all walks of life, but also in the CX world, and especially, I think, when we’re in an executive setting for a presentation. The C-suite is very used to being able to compare themselves, particularly if they’re publicly traded, to other companies on financial metrics, and they have a natural expectation that it’s going to be the same when they get introduced to a customer experience metric. I mean, they’re out on the golf course comparing their golf swings to other CEOs. So I think it’s a natural thing. And the challenge is that there are all kinds of pitfalls with benchmarking in a customer experience setting that make it incredibly difficult to get a true apples-to-apples comparison. From a financial standpoint, it’s very easy to get that direct comparison. In customer experience, it’s very hard, because the data is inherently soft data. So that’s one of the challenges, and I’m sure we’ll talk about many more.
No, but that is a very important one: this expectation of what a benchmark is for a CEO, or for somebody in charge of an operational function, versus what CX can produce from a benchmark standpoint. So let’s think outside of CX, from other parts of the org. When somebody says benchmark, what are they usually thinking of, or what are they used to seeing when a benchmark comes up as a topic?
Yeah, right along with what Sean said, my mind goes right to the Fortune 500, right? How are CEOs comparing their performance? They get called on two things: profit and loss. So when you’re talking at the C-level, you’re talking Fortune 500. But in my experience, we’ve also worked with organizational leaders, like a head of customer support. Their benchmarking is very different: average cost per call, average wait time. A very different set of benchmarks. But they are, again, aligned, as Sean said, to the business objectives. And if you point benchmarking in that way, and frame it correctly, it becomes a standard people can think about, because they already know what a benchmark is. How are you introducing the concept of CX into their existing set of benchmarks?
And it takes me back a little bit to your earlier comment on performance versus process benchmarking. In a lot of cases, those aren’t as different, and correct me if I’m wrong, but they may not be quite as different for somebody in an operational role. Yes, the performance on cost per call or average wait time does vary, and you can’t just compare that to any other company in the world, but there are going to be companies like yours that you, in a lot of cases, will have some insight into, in terms of what a good metric looks like there.
Yeah. I think the underlying theme with the concept of a benchmark is that, as an organization, we aspire to be a leader, and we need some gauge of how far away we are from reaching that. So it could be benchmarking manufacturing excellence, operational excellence, financial performance, but that’s the concept. And it doesn’t necessarily mean benchmarking within your own industry. Maybe we want to have the same manufacturing excellence as an organization in a totally different industry that’s been held up as a leader. Or, the other concept with benchmarking is that it can be completely within the organization: we want to benchmark call center A to call center B, or business unit one to business unit two. So it can be done internally as well, potentially with fewer of the cons, which we might talk about later, than an external benchmark.
Mhm. Yeah. I think that’s a really interesting point I was going to bring up later if it didn’t come up before, but it has, so let’s jump on that a little. We do talk about this on the CX side, and we’ll get into some of those additional issues you have on the CX part. But this ability to have objective benchmarks can be really challenging. There are a ton of things that are going to affect scores, obviously, and even response rates and other things like that matter, depending on how you’ve set your program up. So there’s this question of where we go to get benchmarks, which we’ll talk about more broadly. But Sean brought up this idea of internal benchmarking. So talk a little bit about, one, the value of that to CX, and then whether there are some cons to internal benchmarking.
Yeah. I mean, some of the pluses of doing the internal benchmark: one is that you obviously have that data, in whatever form, readily available, and so some of the downsides of external benchmarking just don’t apply. You should have good data across the organization (that’s still an assumption, and maybe you have wildly varying response rates, but you should be able to control that a little bit more). And I think, critically, what you’re focused on is improving your own performance. You’re not chasing somebody else’s scores; you are trying to improve. We’re assuming that you have intent, goals, and purpose around customer experience, and this is one of the ways you’re trying to get the organization, across product lines, business units, regions, whatever, all rowing in the same direction. So it is a commonly used form of benchmarking for customer experience data. One big potential pitfall, and I think it’s just so tempting because that data is readily available, is that the central CX organization will start comparing things left, right, and center. But there has been, over the decades, a lot of research suggesting you should not compare score results across different countries and regions, because there is a phenomenon called cultural bias that will make those comparisons misleading.
So that’s one potential pitfall: you really need to benchmark within a region, ideally within a country, to avoid that. Then I think another pitfall, and this is more on the business-to-consumer side of the house, is that you might be comparing apples to oranges even when comparing different stores or different hotels that you own, because there are environmental factors that make the comparison unfair. For instance, one Courtyard Marriott that’s just been built and has all the latest and greatest features, USB-C chargers on the bedside cabinets, the whole thing, versus another Courtyard Marriott that was last renovated 20 years ago. They could be in the exact same state or even city. So you have to really make sure that you’re not creating an unfair comparison because of those environmental factors. That applies more on the consumer side. And then, of course, cultural bias is another thing we need to be aware of when doing these internal comparisons.
Important points, though. Again, internal, external, international, right? Those have to be called out specifically, because they will influence not only the results but what actions the business will take. And there are multiple dimensions you have to account for when you’re looking at benchmarking in those areas. Again, as Sean pointed out, if you set a standard as to why you’re doing it and what it reflects, you have a much better chance of success than throwing a bunch of numbers up there and saying, this is where we should be.
You should have seen, by the way, where, you know, that quote…
Oh, yeah.
They all want to compare. As a matter of fact, I had an instance where an executive turned his laptop around during the executive meeting, showed me a competitor’s NPS, and said, why aren’t we at this score? With no other relevance at all, other than a marketing promo on someone’s website. And there was my benchmark: make that score. That was it.
That was the benchmarking exercise.
Which is, you know, you kind of laugh at it, but it’s not too far off from what some…
…stakeholders or even some CX leaders would think is the…
That’s right.
…right piece. Like, hey, Let’s…
But again, no, there was no relevance.
There was no setup, no nothing. It was: here’s a number. Go get that.
Well, that kind of leads into this piece. You know, I often think about benchmarking. Well, I often think about benchmarking, period, unfortunately. But when I think about it, we use these terms generally, but the reality is, especially in CX, there are two sides to this. One is much more of what I call a normative score comparison, which is a little bit of what you were pointing to, James, even though that example was specific. This idea of, okay, let’s get an average score in an industry, or an average score worldwide; it’s just a comparison of a score to a score. And then we’re like, hey, we’re above that, we’re below that, we’re right at that. Yay! Boo! Great. There’s some value to that. But if you think about benchmarking as what it’s actually intended to be, what it’s defined as, it’s the process of comparing your operations and processes against others, either within your industry or outside it, to learn about practices and processes that you could bring inside to improve your own. The idea of benchmarking is, yes, it’s about a comparison, but it’s really a comparison to a process, or to something you can do, in order to improve what you’re doing. It’s: how do we get better? So, with that in mind, as we think about CX, how do we think about this difference? Maybe people start at the normative score, but how do we get them to think about the improvement as the key?
Yeah. What a concept in CX: to take action.
Right? I mean, we laugh about it, but here is the real difference, and I love your norms-versus-benchmarking framing, because that really is the difference when you come right down to it in a successful CX program. Companies that take action, show value to their customers, and improve their bottom line and revenue are successful, versus companies that compare themselves, look, go, “yeah, that’s interesting,” and then don’t understand what actions they need to take to drive that level of success. And you and Sean have brought up two important points. One: compare yourselves against people outside your industry. I know I’ve been in high tech forever, and God forbid we ever ask a question outside of high tech, because “nobody’s like us,” quote unquote. Which is nonsense, right? But then there’s also the component of, what are we really doing this for? Is it just to pat ourselves on the back because we’re better than everyone, or are we doing it because we really want to be the best and we’re going to commit to action? So, a really important point.
Yeah, I think the absence of what I would consider valid, reliable benchmark or normative data (both, really) is a huge challenge and a distraction for a lot of CX organizations to this day. There is a role for benchmarking at a strategic and process level within CX: don’t reinvent the wheel; take a look around. At the very beginning of a CX initiative, we might recommend a maturity assessment. Well, the maturity assessment inherently has a comparison, more to a norm, really, than a benchmark, and that’s a really good use of comparing yourself to others. You might want to benchmark how other companies your size structure their CX teams in terms of roles, responsibilities, headcount, and so on. How do they go about actioning their CX insights and results? That’s the logical path, and it will lead to the business impact that James was talking about, which has been widely proven over the years. So if it’s that obvious, why don’t companies do it? Well, because it’s not easy. It involves organizational change, investment, and a focused strategy. What is easy is finding an external benchmark, comparing your very first round of CX survey results to that benchmark, and then going into a complete panic because the numbers don’t look favorable for your company in front of the CEO and his direct reports. And there are a thousand reasons why those numbers will never be comparable. So we somehow have to avoid that trap, because once you put the benchmarks up there, it’s habitual: the CEO is going to expect them with every single wave of new survey data.
So it really is the responsibility of the CX practice within an organization not to overemphasize the need for benchmarking and comparisons at the very beginning, and to talk about where they will need to compare to what others are doing from a strategy, initiative, and process standpoint, but not from a score standpoint.
You know, Sean, you just reminded me of something. You said something very valuable there about aligning expectations around benchmarking. One of the things we did that we found very valuable is we became friends with the operational people within the business and learned what they were benchmarking. So that when we presented benchmarks, it was on a relatively level playing field, because CX was just another benchmark added to a list of benchmarks they were already familiar with. That was super important; it helped the benchmarking communication be successful.
Yeah. And that’s actually a really interesting and helpful perspective, because at the end of the day, especially when you’re talking about the more functional interactions (customer support, for instance, is a great one), at some level what we want the CX data to be is almost this view of: is what we’re achieving operationally meeting customers’ needs? Because that’s the piece they’re always missing. Hey, we’ve got this great average wait time and this really good cost per call and whatever. Well, but is that delivering the value customers want? If you view it that way, as a view of yes, it is, or no, it isn’t, that becomes a valuable piece, and we’re not caring so much about how that score compares. So you see CX less as its own metric that has to be benchmarked and more as a check: are our operations and processes achieving what they should? Are they providing the value customers desire?
There you go. That’s it, Troy. That was exactly it. If you can answer that question through a benchmark and not get caught up in the quote-unquote CX benchmark, then your future is much brighter, because now you have an organization, or a set of organizations, committed to do something with the information they received.
Well, let’s bridge into something we’ve mentioned a few times now: the issues that exist with comparing to some CX norm or benchmark score from an external source and using that as a main comparison point. We’ll start with you, Sean, since you mentioned a few already: what are some of the issues and problems with these scores, regardless of where you get them, that a CX leader should be aware of and pay attention to?
Yeah. One of the biggest is just the two sources of data that you’re typically comparing. You’re comparing your scores, your results from feedback that you collected from your customers, who opened up an email or took the survey however you sent it out, knowing that it was coming from you. You were clearly identified as the requester of that feedback. And that’s great; that’s why you’d expect to get a good response rate, because people should be motivated to help you improve if they expect to see improvements further down the road. The challenge with the majority of the external CX benchmarks that exist is that they collect data in what’s called a double-blind manner. They do not identify who the survey is sponsored by, and they send the request out to what’s called a panel. A panel is basically a group of people who, for some crazy reason, have opted in to take surveys. So they do not know who sent them that survey request, and usually they’re not asked to evaluate just one company or brand; they’re going to evaluate multiple brands, vendors, and companies in one survey sitting. And there are some data quality issues people should be aware of when looking at that kind of data. First of all, it tends not to be identifiable, so you’ll never know who provided the responses you’re comparing yourself to in the boardroom. And you’ll never really know how considerate they were as they were going through the survey, in terms of thinking about the ratings they provided.
They’re not really incentivized to go through it very thoughtfully, because they don’t know who’s going to see the results. In fact, the opposite is usually the case, because they are incentivized in a monetary way to complete the survey, so speed is of the essence if you are a panelist. All of that being said, there are significant pitfalls in trying to compare the NPS score you got from your own survey, with your own customers answering, to any kind of external CX benchmark. I would just say it is apples and oranges; it will never be apples to apples. Now, there are other ways. If you want to understand how your customers think about your competitors, you can actually ask them. That is another approach, and there’s a school of thought in the CX industry that says that is actually, in some ways, more insightful than just asking them about your own product or service, particularly in an industry where multi-sourcing is prevalent. Nobody makes a decision in a vacuum, and there’s a share of business that is going to you, but also a share of business from that same customer that’s going to somebody else. So that same share, or relative, approach can be applied to customer experience and yield significantly more insights than just asking them to evaluate you. That’s getting deep into some methodological discussions that we probably don’t have time for today. But just to say: there are ways to understand what your customers think of your competitors that don’t involve using an external benchmark.
I 100% agree with Sean. Be aware of the pitfalls; understand the data you are looking at, how it was collected, and, again, how you’re going to use it. One of the pitfalls I’ve seen also, and I think you both mentioned it, is this obsession with benchmarking at an executive level. They have to compare themselves somehow, and when CX is introduced, there’s often a disregard for the validity of what they’re looking at. And there are a lot of conclusions that come right away, and they’re not accurate. So I think, again, setting up what you are looking at and what it means is a really important part of this exercise, because otherwise you’re going to be caught in that never-ending cycle of having to produce benchmarks that may not represent your business or the performance you’re trying to obtain. So beware, as Sean said.
Okay. So this next question is for James, and it might make for good bonus content, perhaps. But I do want to ask you specifically about the B2B benchmark report that you did at Qualtrics, because there’s always this tension: we’re resistant to giving scores and benchmarks, there are so many problems, but people still want them. And so there’s this idea of, all right, let’s do it, and let’s do it in the best way possible to get some comparison points, which you did. But what I also really liked is that you looked at the relationships: the areas for improvement, what drives the outcomes, what’s driving NPS. So there’s this view that, okay, yes, scores, but really, again, benchmarking should be more about how we improve and what the things are that matter. So walk me through your mindset, and the team’s mindset, as you were developing this benchmark study and report.
Yeah. You know, we talk about controversial topics; I think no three letters are more controversial in this industry than NPS. But look, our purpose really was, one, to give B2B companies a set of benchmarks against which they can compare their performance: CSAT, NPS, willingness to continue to do business. What was interesting about the study is that when we started to look at the results, while the NPS and CSAT benchmarks were interesting, what we really wanted to know was what drove those scores. So we looked at every area of the journey map: operations, finance, product performance, support, renewals, sales, all of that. And then we looked at other considerations as drivers of NPS and CSAT. What I thought was the most interesting part of the report was the results on the drivers of those key indicators of performance. In your sector (we did, I believe, nine industries), what drove the performance of your business? In some sectors, renewals had a very important impact on overall CSAT and NPS. In other industries, renewals had zero impact. So it’s interesting to see which industries should focus on which areas of the business, and why, and then how you compare to your competitors. That was our mindset: yes, we want benchmarks, but we really want to know why those scores matter and what you can do about it.
Yeah, and I think that piece of what we compare to becomes important. Our internal score against an external benchmark, to Sean’s point, isn’t always the greatest. If you can get your score versus a competitor’s score in an objective manner, that becomes useful or valuable, but that’s very costly and isn’t really available in a lot of places. So, okay, we are getting short on time. There’s still obviously a ton we could talk about on this topic, but I did want to make sure we got to some take-home value for the listener, and there’s really a lot of take-home value that’s already been discussed. But if you had to summarize this into one or two statements of something a CX leader can really do and take from this on the topic of benchmarking, to help their programs and their companies, what would that be? Sean, let’s hit you first.
Yeah. We’ve talked about this kind of thing throughout the conversation today, but in summary, what I would say is: start by taking a look at yourself. Look in the mirror. Before you compare to anybody else outside of your organization, think about how to improve your current position as it relates to customer experience. And when you do look at the role that comparisons to others can play, instead of focusing on the survey scores, focus on where you can compare yourself to what others are doing really well in terms of their overall CX strategy, initiatives, and related processes. That’s what I would say is the best starting point. And as I mentioned before, if you start by positioning external benchmarks as critical to your CX program, it will be very difficult to back down from that position in the future, and it’s not a position that we would recommend.
Yep. Excellent advice. James, your take home value?
Yeah. Whether you’re presenting benchmarking or any other kind of CX data, I always think of what I call the three stages of data grief. First, your stakeholders will attack the data. Then they’ll attack the methodology. Then they’ll attack you. Once you get through those three stages, then positioning, over-communicating, as Sean said, what we are looking at and why, is how we should set the stage. It is not a general comparison of performance; it is a directed effort to establish a platform of comparison and to take action based on that to drive business improvement. And as we’ve both said, ultimately the goal is to produce the best experience possible for the customers and a thriving business.
Well, James and Sean, thank you so much for all of your wise and sage advice for our CX leaders on this topic. If you want to talk about anything you heard on this podcast or about how Walker can help your business’ customer experience, or if you have a conundrum that you want us to tackle in future episodes, please email us at podcast@walkerinfo.com. Remember to give The CX Leader Podcast a rating through your podcast service and leave us a review. Your feedback will help us improve the show and deliver the best possible value to you, our listener. Check out our website, cxleaderpodcast.com, to subscribe to the show and find all our previous episodes, podcast series, and a link to our blog, which we update regularly. The CX Leader Podcast is a production of Walker. We’re an experience management firm that helps companies accelerate their XM success. You can read more about us at walkerinfo.com. Thank you for listening, and remember: it’s a great time to be a CX leader. We’ll see you next time.
* This transcript was created using an A.I. tool and may contain some mistakes. Email podcast@walkerinfo.com with any questions or corrections.