Hi, everybody. Welcome, and thank you for joining our relevance roundtable for service leaders. Sorry about the technical difficulty there; clearly I had the wrong deck. It happens, but we're amongst friends. If you've joined one of our relevance roundtables before, you know what to expect. If you haven't, I'll reintroduce the team: I'm Tracy Carson, from the global marketing team. Today we are speaking with Alex Van Fossen from AVEVA, er, OSIsoft. I almost nailed it there, Alex, sorry. He's a knowledge engineer and a force to be reckoned with when it comes to all things KCS and search at his organization. And pinch hitting today for Bonnie Chase is Ari Hoffman, so this promises to be a treat no matter where the conversation may take us. I'm extremely grateful Alex was able to fit us into his calendar today. For those of you who are new, a few very simple concepts: these roundtables fill the gap when we can't get together for coffee at TSW or a Consortium event. They're meant to be super interactive and to give us a safe space to talk amongst peers who are really willing and able to share their knowledge and their experiences, the good and the bad, and Alex is here to help us do that today. We want you to challenge our experts, so get ready, both of you, and we also want you to walk away with ideas. Raise your hand through the Zoom interface; when you do, I can take you off mute, and I can even put you on camera if you're feeling brave. We do want you to participate, so I hope you brought some great questions. The session is being recorded, and in twenty-four hours or so you'll get a copy in your inbox. Feel free to share it with your team and invite them to our next one, which we would love. And with that, I'm going to pivot right into the content, because we don't want to waste a minute. Two weeks ago, Alex joined us for a session with John Ragsdale and TSIA. Some of you may have seen that session as well, and we invited you to this one as a follow-on. For those of you who didn't make it, not to worry; we'll get you caught up fast on all the things Alex discussed, including knowledge management strategies, effective self-service experiences, and some of the proven methods he's applying with his team today. We had a ton of questions from the audience, so it seemed only natural to invite Alex back. He's not quite sick of us yet, thank goodness, and we're going to put him through the wringer. I promise it'll be gentle. We wanted to make sure we left plenty of room and lots of time for discussion with this particular group, knowing how active and engaged you've been with us on our roundtables before. Alright, I'm getting out of the way. Ari, Alex, you're both off mute and ready to go, so let's get the conversation started. So, leadership support. We all know how important it is to get buy-in from leadership, but I think the bigger question here is how do you leverage that support on an ongoing basis? How do you continue to get maximum value out of that support, Alex?
Well, we were lucky in that the head of our customer success department had come from Oracle, where they'd already launched KCS. So she was a big proponent of KCS from the get-go, and everybody else kind of had to catch up. What I found is that you want to work from both ends of the spectrum. You start with your pitches to the frontline, who are going to be doing all of the work that KCS is based on. You need their buy-in; they need to understand what you're talking about. Then you work up toward the managers from them, and down from leadership to the middle managers as well, because nothing motivates somebody like knowing, well, my boss wants this to happen. So pitch it to the frontline first, find the few people who are interested, latch on to those people, and get them to help you out. Then you work your way up to the managers, and you also work down through your top leadership, from your regional directors to your middle managers, at first just explaining what it is and what the expected benefits are, and getting them used to the idea. And again, as you're going through this, find the people who get it the first time and want to help you out, and then grow the support from there. As more people learn, they tend to embrace KCS. You know, you bring up a good point about making clear what those objectives are, and nothing paints a clearer picture than the metrics that matter. So what measurements do you find leadership is really looking for to keep apprised of how things are unfolding? Well, I'm going to give a partial answer. The best answer is in the KCS best practices on the Consortium for Service Innovation's website, so go check that out; they have a list of all of the metrics you want to follow. Other resources I've used are from FT Works and David Kay and Associates, who have blog posts on this as well. What we did at OSIsoft is we looked at time to resolution and how it was going down. We monitored basic things about the knowledge base, like whether the number of articles is going up and what our creation rate looks like. KCS is really big on something called the crossover point, which is the point at which the number of cases solved with existing articles crosses over the number of cases solved with new articles. When you hit that mark, about fifty percent of your cases are being solved by something you already have in your knowledge base. And at that point, the program starts to snowball, because when people are solving their cases in, say, half the time because that knowledge already exists, they start to see the benefits of KCS, and then they're more motivated to participate in the program. So we looked for that crossover point first, and then we looked for other metrics that were more directly related to value, like time to resolution, second. Got it. And obviously, when you're talking about that snowballing effect, the program is maturing. So as the program matures, do you have a sense of what else leadership wants to know? Yeah. I would say you're never done collecting metrics. We've been doing KCS for about two years, and the metrics people want keep changing, which is not to say people are fickle. It's just that as you learn more about your program, the questions you have also change.
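To make the crossover point concrete, here is a minimal sketch of how it could be computed from per-case link data, assuming each solved case records whether the linked article existed before the case was opened. The sample data and field names are illustrative only, not OSIsoft's actual reporting.

```python
# Minimal sketch of the KCS "crossover point": the month where cases solved
# with pre-existing articles overtake cases solved with newly created articles.
# Data and labels are illustrative, not any vendor's real schema.
from collections import Counter

# (month, link_type) per solved case; "existing" means the linked article
# predated the case, "new" means the article was created to solve it.
cases = [
    ("2019-10", "new"), ("2019-10", "new"), ("2019-10", "existing"),
    ("2019-11", "new"), ("2019-11", "existing"), ("2019-11", "existing"),
    ("2019-12", "existing"), ("2019-12", "existing"), ("2019-12", "new"),
]

monthly = Counter(cases)  # keyed by (month, link_type)
months = sorted({m for m, _ in cases})

for month in months:
    existing = monthly[(month, "existing")]
    new = monthly[(month, "new")]
    reuse = existing / (existing + new)
    flag = "  <- crossover reached" if reuse >= 0.5 else ""
    print(f"{month}: existing={existing} new={new} reuse={reuse:.0%}{flag}")
```

Once the reuse share stays at or above fifty percent, at least half of incoming cases are being answered by articles that already exist, which is the snowball moment described above.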
So we started with time to resolution, and the stuff we're looking at next is self-service. We're seeing people come to our website, we're seeing them do searches, and the click-through rate is high. So can we dig into that? Is the click-through rate high because people are click happy, or are they actually getting value out of what they're clicking on? And then we also want to look at, okay, we know that we're going to answer more cases with self-service than without, but how is that affecting our bottom line? Are we providing better service with self-service combined with cases? And how do we judge that value so that we can properly invest in self-service the way we need to? That makes sense. And having leadership support, that top-down support, is always crucial, but culture matters. So how do you use the company's existing culture to create a readiness for KCS? Right. So at OSIsoft we have always had a pretty good support culture, I'd say, as long as I have been there. We live by the mantra of our CEO, former CEO now, of support the hell out of our customers. And I think that idea is absolutely critical: the reason tech support exists is to better your reputation amongst your customers. If you don't have a good reputation, they're not going to buy your stuff. Right? We all know this; it's obvious. It's just that when it comes down to budget meetings, it's really hard to put a dollar amount on reputation. I know there have been a number of surveys and metrics that have tried to do that. But I would say you want to make sure that your reputation metrics are good and just assume that is leading to dollar value down the road, even though you don't know exactly how much that dollar value is. So we are constantly monitoring CSAT. We are also going to be looking at net promoter score for the whole company and making sure those numbers are healthy or improving. I think within the first year of implementing KCS, we saw a CSAT score jump from about four point seven to four point eight five, which was the highest we'd ever scored on CSAT for an annual review. That's amazing. You know, jumping from company culture over to something that I've heard over the years again and again, which is getting buy-in from your support engineers to work this into their workflow and become content creators, when they can sometimes be measured on things that conflict with their KCS readiness. Do you have any tips for how you get that buy-in from your support engineers and overcome some of the objections they might have? Yeah. So first and foremost, you're not going to get everybody to buy in right away, and that's okay; that's normal. There are going to be some people who are already thinking about the big picture and trying to find some way around the big problems your organization is facing. They're going to hear about KCS, and they're going to get it immediately. They're going to go, this is amazing. They're going to ask you a lot of questions about how it works, they want more information, and they're really excited. So latch on to those people and get them to help you.
The second group of people who are going to be pretty vocal are the people who say, there's no way this is going to work. They see it as a lot of effort: there's no way we're going to be able to maintain consistent quality, the people you're asking to write articles don't know enough. These are probably going to be concerned backline engineers who want to make sure we aren't putting all of our eggs into a basket that's not going to float. The metaphor was a bit mixed, but I think you get the idea. Yeah. Those people are also people you want to latch onto. You're going to use what you've learned about KCS to address those concerns. You may need to shift their thinking. But if you can bring them over to your side, they will be stronger advocates than even the first group of people who were excited to begin with. Once you get them on board, you usually have enough people participating that your knowledge base starts to grow. And I'd say, and this is a rule of thumb, not perfect science, that about twenty percent participation from your frontline is what you need to start seeing the growth to sustain your KCS program. And again, if you can hit that crossover point, then you're going to start snowballing. He's supposed to be napping. Hang on one second. Is that technical support coming in? That's all good. That's all good. While we wait, I'm going to ask the audience; I put it in the chat as well. How many of you have reached that crossover point, if you're practicing KCS already? Is it already in sight, or has it happened for your organization? Feel free to raise your hand, no big deal. Alex is back, so I don't have to fill too much more time. But do let us know, and like I said, keep an eye on the chat for some of those questions; we have lots of time for Q and A as well, so don't be shy. Where was I? It's all good. You had covered the twenty percent buy-in, right, to really start to get the measurements. Right. So the hardest hurdle when you're starting KCS is building up that knowledge base. When your engineers are being asked to write an article after every single case, what I found is that most of them will write articles after some of their cases. So you're still building the knowledge base, but you're not doing a hundred percent KCS at that point. But as long as you're getting about twenty percent participation, and that could be twenty percent of people doing it a hundred percent of the time, or most people doing it some of the time, then you can build the knowledge base up to the crossover point. And at that point, it frees up a lot of their time to work on new articles, because most of the cases are being answered with existing articles. Yeah. That's how you get up to something like eighty percent participation. And we talk about our support engineers getting that buy-in and really supporting the program, but I know at Coveo our CS team uses KCS, they're all certified, and it's incredibly valuable. Do you see other opportunities in your organization where KCS can be leveraged? Oh, yeah. Absolutely. KCS was designed for tech support, but it works in any situation where you have knowledge that is best classified as frequently asked questions. Right?
So when people are trying to get information from your company, that could be tech support, that could be customer service, that could be marketing. You want to make sure people understand that when they get a question they know the answer to, they can document that somewhere. It doesn't matter what department they're in. So some of the departments we've looked at spreading this idea to would be our own internal IT, our HR department, our marketing team. Yep. It's a slow process, but we continue. Yeah. You know, as has been recommended by David Kay and by the Consortium themselves, it's always important that you have at least one champion leading the effort. So what I would want to know is, who makes a great champion? What do you look for? That's a great question. I would say that all of the champions I've seen are really sold on the idea, so I would start there. If you have somebody who is sold on the idea, who says, this is going to change everything, we absolutely should do it, that's a good person to champion the cause. Other traits, I would say, are understanding the support organization really well. I came up through the frontline and then moved into knowledge management, so I had both an idea of what our frontline engineers were going through and an idea of how the knowledge base worked on the back end, and I could combine those two sets of skills to understand where we needed to go next to solve some of the more persistent problems with our knowledge management strategy. Communication was something I had to do a lot of, so if they're a good communicator, great; that's going to help them out a lot. KCS is going to take essentially an over-communication amount of communication to really convince some of the holdouts that it's worth doing. And then they need to have some amount of skill organizing people to work on projects to move the whole program forward. That was probably my weakest area starting out, but I took a class, and I'm doing better. Got it. And that kind of rolls over into who a good coach is. But before I get to the coaches: of the people who leverage KCS in your organization, which roles are most heavily participating? I'd say everyone in the organization participates in KCS at some level. The heaviest participation probably comes from the frontline. They're writing your solve loop articles, and they're both the largest part of the organization and making the largest chunk of the knowledge base. Next up would be your backline engineers. They're vetting stuff, they're looking at the knowledge base as a whole, and they are your knowledge domain experts. They are trying to take all of this information coming in from the frontline and parse it to put it in the best possible state for your customers. Then the managers are participating because they're looking at whether things are getting done when they're supposed to be getting done, whether we're prioritizing things, and whether we're making sure people understand the bigger vision so that when they're doing their day-to-day job, they're contributing to that bigger vision. So frontline, backline, then managers. And then our coaches are very heavily invested in KCS. They are a mix of frontline and backline engineers.
They are the people we've identified as being most excited about KCS and most willing to help their peers learn the KCS methodology. I mean, you're basically hitting on what I was going to ask about coaches, and we understand the role the coach plays in your program. But how do you identify a good coach versus just a champion? Right. So I would say coaches have three jobs. They are the communication highway between the people making the decisions on KCS and the people actually doing the KCS, and that is their most important job. The second job is that they are cheerleaders. They are excited about KCS, so you want to look for people who are excited about KCS and tell them, you're going to be a coach. The third thing they do is train and assess the mentees. So you're looking for someone who can be a leader, but leadership from a peer perspective. They don't have any authority over their mentees; they are looking to help their mentee improve. That is something we've known from the beginning, but we've had to revise how we communicate it to our coaches. Because initially we were like, okay, there's this AQI thing you've got to do, the article quality index, and then there's this PAR thing you've got to do, the process adherence review. You do that, you figure out your mentee's score, and then the places where they're not doing so great, that's where you coach them. And that last piece, the "that's where you coach them" part, was actually the most important. They heard all that stuff at the beginning, AQI, PAR, and then they got to the "and that's how you coach them" part, and they went, what does that mean? So we've had to develop coach training on how you lead your mentee to set their own goals, how you keep them accountable toward those goals, and how you encourage them and show them they're making progress so that they are excited about doing KCS. And that is the part that took the most time to get right. Right. That self-ownership, really seeing the results and the buy-in and wanting to take part in it. Right. You know, it's fun at the same time, though. You're really learning about human beings, and it's a positive, reinforcing practice once you get into the swing of it. But let's talk about getting started. You've done the prep work. You've got some buy-in. You've spotted some champions. You've set up maybe some potential coaches. But getting it off the ground is another thing, getting that launch. And I think I remember that you didn't do a pilot program when you launched your KCS. So what would you say the pros and cons were of the approach you took? Well, let me first say, I recommend you do a pilot program. The pros: we were in a particularly unique situation. Well, maybe not that unique, but not a situation that I would say is described by the KCS best practices guide. The guide sort of assumes that you are in a support organization and you're rolling out just the KCS program, and why would it assume otherwise? The situation we found ourselves in is that we were trying to roll out the KCS program, an entirely new case management tool, and a new knowledge base tool, as part of a much larger digital transformation that our company was going through. And we were also going through several organizational reorganizations at the same time.
So everything was changing all at once, and we had a hard deadline for when we were going to flip to the new tool. So we set our KCS launch at that same time and trained everybody on KCS at the same time we were training them on the new tool. When we moved to the new tool, there was a no-going-back situation, which was good. We were doing business differently, and part of doing business differently was KCS. And since we couldn't revert to our old tool or our old methodology, we had to keep going forward with KCS. So that was helpful. I think if we had started a pilot program with our old tools, we would have struggled against the tools a lot, and we wouldn't have gotten that snowball effect. People would have associated the problems we had with our tools with problems with the methodology itself, and then they would have soured on the methodology, when the methodology was fine; our tools just weren't capable of carrying it out. So that's one good thing that happened from not having a pilot program. The bad thing was we had a lot of questions. We knew a lot about KCS in the theoretical sense. We had learned about it from the practices guide, we'd gone to webinars, we'd learned from what other people had done. But actually rolling that out and making it policy, well, no battle plan survives contact with the enemy. So we had to change a lot of policies, and we had to rework some things that weren't working right in our tool and work around them. And I'd say the biggest issue was just trying to maintain quality. We had an expectation from our backline engineers for a certain level of quality, and we were trying to lower that so that our frontline engineers could write these articles and keep them in a sort of work-in-progress state. Reconciling those two different views about what quality should be was a constant battle for the first year. Right. Especially keeping in mind that sufficient to solve kind of... And actually, I'm going to step in, sorry, with a really good question. There are quite a few coming in, so forgive me, but I thought this might be a good one to interject here instead of later. Building knowledge creation into goals. Alex, I think you may have touched on it in the past webinar, but our anonymous attendee, who didn't want to put their name in for whatever reason, and I respect that, thanks for throwing the question in, says they hear this could be disastrous and could also trick the system into creating knowledge just for the purpose of meeting the goal. So, basically, incenting bad behavior. Is that true? How did you avoid that? So that has happened to other KCS programs, or so I have been told by the person who trained us. It did not happen with us. And I think the key is to make it clear to people that creating knowledge is not the goal. Solving problems before they reach you on the phone is the goal. Making it so that the next person who runs into this problem can solve it faster, whether that's your teammate who got a call from a customer, the customer themselves who might be considering giving you a call, or maybe just you needing to remember what it was that you did, through this article: that is the goal. So we were very clear on this: you get your case.
The first thing you're going to do is look for existing knowledge, and you're going to use it. Best case scenario, you use it, you don't have to change it, and you're done with the case. Yeah. That was the goal we were heading toward: you get a case from a customer, you find the answer immediately, you give it to the customer, and you're done. And then, as part of getting to that goal, if you don't find an answer, you have to create it, so the next person who goes through this doesn't have to redo everything that you did. If you do find an answer but it's not quite up to snuff, you need to take what you learned and put it in there, so, again, the next person doesn't have to relearn what you learned. Getting people to think about others, their peers at the company, and to value their time so we could all work together to improve efficiency was something we really had to hammer home to get that to work. It's a really good question, and a really good answer, and something people do need to think about. You know, it's the mentality, the focus and the perspective that you're building into people; you're having to empathize with others. I do know there are other questions. Do you want to approach any of the other questions now? Yeah, I think we'll get to a few of them in the flow that we've got here. One that comes to mind, and Ari, it's sort of in what we've talked about with Alex before, is around reaching that twenty percent participation. Yeah. So the question, and let's let it lead into that part of the conversation, is how do you sustain the training of coaches so they're able to continue, and how often are you doing that? I guess, what's your coaching practice, in parallel with all of these other things going on at the same time? Oh, boy. That's a lot. Okay. Some basic numbers: we try to have a ratio of coaches to mentees of one to six. It ranges from about one to four to one to eight. Your coaches who are mentoring engineers who are more advanced in KCS and have been on the frontline longer are going to need to give less help, so those coaches can handle more people. Your coaches who are mentoring new people who just got hired, where everything is brand new to them, are going to need to give more help, so those coaches are going to be more involved in their mentees' improvement. So that ratio of one to six is not a golden ratio. I've seen other ratios; I think Akamai is doing something like one to two, which just boggles my mind. And I think the KCS guide, they might have changed this, but when I read it, it was one to eight. So we're somewhere in between those two extremes. That's the number. With a total organization of about three hundred engineers, and that's probably a bit high of an estimate, we have something like seventy coaches. That's way more coaches than I can manage on my own. So we implemented something rather recently called coaches coaching coaches, or super coaching, because that's much easier to say. It's this idea of coaches coaching other coaches who are their peers. We don't have coaches who are at a higher tier; we just have coaches coaching each other, just like they would coach a normal mentee.
And we tried to pair them up so that they're under the same manager if possible, because that just keeps it simple, or in the same office, or in some cases just in the same time zone, or in some cases not even that; they just have to have some other peer they can meet with. That way, coaches can get feedback not only on their own articles, but they can also see somebody else trying to coach them, and having the experience from the other direction improves their own coaching. It's like a coaching buddy system. Yes. Yes. You know, earlier, and what Tracy just hit on, is that twenty percent participation. I would assume from the crossover point and what we've talked about before that you have hit that. But talk to me about what it was like getting there and how much time it took. What can people expect on that twenty percent mark? Right. So we were actually a bit behind in our measurements of the program, so by the time we measured participation rate, we were already at forty percent. My best guess for when we hit that twenty percent mark is probably within the first quarter after launching the program. From there, we went from twenty percent to forty percent to sixty percent within the first three quarters. And I would say the main thing that led to that success was the coaching program and being consistent. As long as people were getting reviewed by their coaches, they knew someone might see these articles and read them. They're not just being sent off into the void; they actually matter. So people think, I'm going to do, if not my best job, then at least a good enough job that I'm not embarrassed about it. Just having a coach led to that level of participation. I'm pretty sure it didn't matter whether or not the coach was any good, because at that point we hadn't rolled out any coach training. Just having somebody looking at your articles was enough to motivate people to participate in the program. And then I think the next big shift that got people participating was the crossover point. People started to run into situations where they'd get a case, they'd know it was going to be a hard case, they'd do a search not expecting to find anything, and then they did find something and were able to solve the case like that. Those kinds of wins start to pile up. People start to see the value of the program from those anecdotes, and that makes them more willing to participate because they want that to happen again. Right. They want that to happen every time. And the third big thing that led to increased participation was something called sufficient to solve, which is straight out of the KCS program. We didn't invent it, but we had to re-emphasize to people that sufficient to solve really doesn't mean you have to write a gold, diamond, perfectly cut gem of an article. You just need to get to a point where somebody can use what you learned and apply it the next time the situation occurs. Resetting those expectations helped people be more willing to participate. And then one more key thing we did is going to take some explaining. When we first launched KCS, we were thinking, well, we'll be ahead of the curve. We're going to split our knowledge base into two groups, validated articles and not validated articles.
Our validated articles were going to be validated by publishers, just like the KCS program says, but we were going to show both to our customers, so that our customers could use all of our knowledge base instead of just the stuff that got validated. That was a big mistake. We should have launched with just the validated stuff going out to customers first, and then exposed the not validated stuff later, when we were more comfortable with it. So we actually backtracked that a little bit. Now we use a state called work in progress, and it's slightly different than defined within the KCS principles. Basically, the very first draft of an article that you write, we just assume it's terrible, because all first drafts are, and we keep it hidden from the customer as long as it's a first draft. Only when it's used that second time, and it goes through its first review, do we make it available to the customer by moving it to not validated. So we still have validated and not validated shown to the customer, the difference being that validated stuff is reviewed by our specialists, and not validated stuff is just what we have available right now, use at your own risk. But the especially rough stuff, that first draft stuff, is actually hidden from the customer now. Yeah. You brought up something that I actually hadn't heard of before, exposing the unvalidated content in that method of launch. So you definitely learned that one, I shouldn't say the hard way, but it gave you a lot of experience. Right? Yeah. I mean, it was the hard way. That one decision led to the most consternation we have had about the KCS program, which was, how do we ensure that the quality we're showing to our customers is sufficient if we're only trying to meet the bare minimum of this content standard and trying to get it out as fast as possible? If we had just said, okay, we're trying to meet the bare minimum, we're trying to get it out as fast as possible, but we're not going to show it to the customer, that would have eased so many people's minds. Yeah. Alright. So we've gotten started. Now it's about prioritization: big rocks, little rocks, keeping everything in alignment. And from our interviews, from your TSIA session, something that is music to both of our ears, obviously, is something you keep saying, which is search at the center of the experience. Oh, yeah. We love hearing that. So why is that so critical? If people can't find your stuff, they can't use your stuff, and it doesn't matter how much stuff you have; your whole project will fail. KCS is based on this idea that you search for an answer, you find the answer, and you use the answer again. If you can't do those first two steps, then your whole KCS program is going to fall apart. You have to have both the knowledge base and the search experience to make this work. So when you're actioning content gaps, from the one knowledge base to the other through that initial launch, how do you continue that? How do you manage that? Right. So we did something... wait, hang on. Let me back up and think about this. Right. So we did something against KCS guidelines, which was we took our old knowledge base and we shoved it into the new one.
KCS does not recommend you do that, because they're sort of assuming that you're starting with an existing knowledge base that is so old and broken that there's not really anything valuable in there anyway, and if there is, people will go look for it, and at that point you move it. So we did a hybrid of that approach: we took our Google Analytics, determined which knowledge articles our customers were looking at, and preemptively moved those so that they would be available when we went live with the new system. So we didn't grab everything. Where we use Coveo and the content gap analysis is, instead of redirecting users to the new KB article page, which might not exist, we directed them to a Coveo search page using the KB ID number as the search term and then filtering on KB articles as a content type. With that very specific search, we were able to comb through the content gaps in our queries and look for queries that had that specific format, where they're looking for this number and they're filtering on this type. And so we could then determine which articles our customers were looking for that we hadn't yet moved. We used that to determine, okay, we have to preemptively move these KB articles, rather than waiting for our customers to call us and tell us something was missing that they wanted. Due to a totally unrelated issue, it turned out customers going to our community site would click on a link to go to a KB article, and they would get redirected to this Coveo search page, but because the credentials weren't transferred right, it would just show them a blank page. So we continued to get complaints about people not having KB articles that they could use, and we were very confused, because if they ever told us what the KB article was, we'd go, but that is moved, and it's right here, and click on this link, and now it works for them. What's going on? It took us something like nine months to figure out that was the problem. Wow. And the whole time, I'm looking at this content gap analysis in Coveo, and I'm like, this doesn't make any sense, because we're not seeing people looking for things and not finding them. People are finding them when they search for things. Well, if your page never loads, then you're not going to get any analytics on the page not loading. Right. That makes sense. I mean, so you've talked about different locations of knowledge bases and where you're bridging to, but you also have varying content source formats. What was your strategy around different content sources? I want to say it was pretty out of the box. We just hooked it up into Coveo, and for the most part, it worked. And so our strategy was just: put everything in the same search engine, and it doesn't matter what it is, the Coveo search engine will find it for them, and then they can get to it. It was a bit more complicated than that. Some stuff had to be moved because of the new tool and how it works, so there were some complexities around getting, for example, our product downloads to show up right. We had some older tools, like our online user guide documentation, that used a very, very old tool, so we had trouble indexing that, and I think they're just about to fix that this month. But other than those two situations, everything else was smooth and easy, getting it hooked into the Coveo search engine.
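As a rough illustration of the gap analysis described above, the sketch below scans an exported list of search queries for ones that look like a legacy KB ID and have not yet been migrated. The ID pattern, the export format, and the sample values are assumptions made for the example; this is not Coveo's API, just the kind of post-processing you could run on any query export.

```python
# Sketch: find searches whose query looks like a legacy KB article ID and
# that point at an article we have not migrated yet. All data is invented.
import re
from collections import Counter

KB_ID = re.compile(r"^KB\d{5}$")  # hypothetical legacy article ID format

# Pretend export of (query text, result count) rows from search analytics.
queries = [
    ("KB01234", 0), ("KB01234", 0), ("KB00777", 3),
    ("pi server backfill", 12), ("KB04321", 0),
]
migrated = {"KB00777"}  # article IDs already moved into the new knowledge base

misses = Counter(
    q for q, _ in queries
    if KB_ID.match(q) and q not in migrated
)

# Most-requested articles that still need to be moved, highest demand first.
for kb_id, hits in misses.most_common():
    print(f"{kb_id}: searched {hits} times, not yet migrated")
```

The same idea works whether the miss list comes from a content gap report or from raw query logs: sort the unmet KB-ID lookups by frequency and migrate from the top.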
And then, again, as long as you direct people to that search engine, it doesn't matter where the content is; they can still find it. Right. So we've prioritized the right things, the big rocks, and moved our little rocks. But this is a journey, a continuously evolving journey. And a question that I have, and that I know others must have as well, especially for companies that are either acquiring other companies or being acquired, is: when you acquire a new company, or they acquire you, how do you manage that content and how does it expand? So with AVEVA, for you, how do you handle this new content? And where do you start to align and streamline that content experience? I might have a better answer for you if you ask again in about six months. Right now, what we're doing, and I'm pretty sure I can share this, is just trying to figure out who the people on the other side of the merger are that we need to talk to. Who is managing content? What kind of content do you have? What are some areas of overlap we could consolidate into one tool? So I am still looking forward to meeting the person who is going to get their life changed by me talking about KCS, but I haven't met them yet. So I don't have any details for you on that question, sorry. No, it's okay. I mean, have you done any work with any of the content yet from AVEVA? I think I did talk to one person about our newsletter and how they handle their newsletters. And one thing I found encouraging is that our newsletter is pretty manually compiled. I do that myself, about once a month, and it takes me hours, so I'm just about done with that approach. Their newsletter is largely automated. They have announcements that go out to their customers, and those announcements have a particular product and a content type associated with them. They let their users sign up for those different content types or products, and choose how often they want to get them, and then the newsletter is just an email that's assembled from all of the links to the announcements they're churning out on their website. That is something that I want to do, because that sounds like an amazing content management system. And they looked at our newsletter and went, oh, it has pictures, that's cool. And I was like, not as cool as automation. As long as it's providing value, right. Right. That's it. There are a few questions I'm going to jump in with. We've got a lot to get through, and we do have a few minutes, so let's do it. We have Sarah Melbain, a knowledge director at UKG, with a specific question about tying knowledge metrics to TTR. And maybe explain what TTR means to you and why that metric matters to you, Alex, because I know you mentioned it before. That was the question; sorry, I lingered too long on that. What are the specifics of tying knowledge metrics to TTR, and have you done it? Right. So TTR stands for time to resolution. The way our case management system works is, a customer contacts us, we send them back an answer, and we mark the case as answered. So the time to resolution is the time from when the case was opened to when we gave them that answer. In other words, the amount of time it took to research the problem and get to what we thought was the solution.
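A minimal sketch of the TTR calculation as Alex defines it: elapsed time from case open to the answer being sent, averaged over closed cases. The timestamps below are made up for illustration.

```python
# Sketch of time to resolution (TTR): average elapsed time from case open
# to first answer. Values are illustrative only.
from datetime import datetime
from statistics import mean

# (opened, answered) timestamps per closed case.
cases = [
    ("2021-03-01 09:00", "2021-03-05 17:00"),
    ("2021-03-02 10:30", "2021-03-04 08:00"),
    ("2021-03-03 14:00", "2021-03-10 09:00"),
]

FMT = "%Y-%m-%d %H:%M"

def ttr_days(opened: str, answered: str) -> float:
    # Elapsed time in fractional days between the two timestamps.
    delta = datetime.strptime(answered, FMT) - datetime.strptime(opened, FMT)
    return delta.total_seconds() / 86400

print(f"average TTR: {mean(ttr_days(o, a) for o, a in cases):.1f} days")
```

Tracking that single average over time is what produces the kind of drop Alex quotes next, from roughly eight days to under four.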
So when you take the average of that for all of your cases, it roughly measures how fast you are at getting back to the customer. If you see a drop of about fifty percent, you're solving cases in half the time. We went from, I want to say, eight point one days to three point nine days, which is about fifty percent, and that was over a two-year time frame. So if our engineers are solving cases in half the time, that's a huge efficiency gain. And what that means is that instead of increasing our head count by fifty or sixty engineers, all we have to do is replace turnover at this point. Even as you're expanding with customers, right? Even as we're expanding our customer base and going through mergers, all we have to do is replace turnover. As our engineers go off to development or wherever else they want to be in the company, we just have to bring in a new hire to fill that seat, and we're so efficient with the knowledge base at this point that we don't have to grow our head count to meet our new customer base. You know, I know there's a follow-up question, which I think is really important, which is, how did you connect the knowledge metrics to that reduction in TTR? Oh, that is a good question. I'm trying to remember how we did that. I think I ended up using link rate. This was several months ago when I went through this analysis, so it's kind of way back there in the cobwebs. I think we looked at the link rate and also the type of link. We saw the existing links growing; that is, cases solved by articles that existed before the case was opened. That percentage went up to something like seventy-five percent. So three quarters of our cases are solved with our knowledge base, and the article existed before the case was opened. And then something like ten to fifteen percent are new articles, so issues we've never seen before. And then I think about eighteen percent aren't linked to an article, so either they're linked to some other documentation source, or it wasn't really a question that could be answered with an article and we didn't bother to attach one, or the person didn't do what they were supposed to do. So from those numbers, we could see that growing over time. Back in twenty nineteen when we launched, we started with an existing link rate of about forty percent. Having grown it to seventy-five percent, we're now seeing that our time to resolution has dropped dramatically while that rate has been climbing. So to me, given that you solve cases faster with an existing article, it makes sense that those time to resolution gains came from the growth of our knowledge base itself. That makes all the sense in the world. And how about integrating KCS into the support escalation process? Was that done right away, or did you focus on a certain tier of your agents or engineers? I'm not entirely sure what this question is getting at. I think the short answer is yes, we implemented KCS across all of support. Solve loop articles were the responsibility of the frontline. So we made it... actually, let me back up. We don't pass cases from frontline to backline when they're escalated. The owner of the case remains the engineer who took the case. I think the term for that is swarming, but we've been doing it since before swarming was a thing, so we just call it normal.
So when the backline engineer works on it, they actually take something called a collaboration request, which is a case within a case. If a frontline engineer has a question, that question goes to the backline, the backline answers that question and then sends it back to the frontline. And they can see the case and all the details, so they have the context for that question as well. But the backline engineer is only the owner of that collaboration request. Ownership of the case, communicating with the customer, and actually solving the problem remain in the frontline's hands the whole time. The ownership is not transferred to backline at any point. It might change hands between frontline engineers, but that's a whole other thing. So with that in mind, it made the most sense that if your article is the resolution to your case, then the article has to be written by the owner of the case, which is your frontline employee. So I think that answers the question. But if what you're trying to get at is, well, what does backline do then, that's where you start talking about knowledge domain experts, and that's a KCS term. Our backline people know the product really, really well, and sometimes our frontline people get stuff wrong. So the way we've balanced the product expertise of the backline with the constant churn of the frontline and cases is that if an article ever reaches the point where it's got ten cases linked to it, in other words, it's helped ten customers at this point, well, then we know it has some real value, because it was effectively the first solution and then nine people used the same solution. That's a lot of value there. So we send it up to the backline, and they prioritize who's going to work on it on their own; we leave it up to the backline group to determine how they want to divvy that out, and they go through the process of validating it. They check for technical accuracy, they check that it follows the content standard, they put their validation stamp of approval on it, and they publish it out. So now it has a validated status, and it shows up on our website with a big green check mark. So I guess most of our backline's job around KCS is going through that validation process. But we found that as they go through that validation process, and I'm sorry this answer is so long; when I don't know what you're asking, that's when my answers get really long, the validation process also allows our engineers to see, okay, what is it that's floating to the top, that's really popular, that's new? So they can look at that, and look at the scope of all of the validated articles they've worked on, and ask, well, what is the underlying problem where, if people just understood this one thing about the product a little bit better, we wouldn't be getting all of these different cases? So they're able to consolidate what look like disparate issues into these troubleshooting articles where you have this general symptom, you might be running into one of these three problems, here's how you distinguish between them, and then they'll link out to the solve loop articles written by the frontline. So we got a high five from the question asker in the process. Thank you for that, and thanks for the great questions. I don't want us to run out of time.
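To tie together the link-rate breakdown and the ten-linked-cases rule described above, here is a small sketch that computes the existing versus new link rates and flags articles that have crossed the validation threshold. The case records, article IDs, and counts are invented for illustration only.

```python
# Sketch: link-rate breakdown plus the ten-linked-cases validation trigger.
from collections import Counter

# One record per closed case: (linked article ID or None, link type).
# "existing" means the article predated the case, "new" means it was created
# to solve it, None means no article was attached. Illustrative data.
case_links = [("KB-101", "existing")] * 11 + [
    ("KB-102", "new"), ("KB-103", "existing"), (None, None),
]

total = len(case_links)
by_type = Counter(kind for _, kind in case_links if kind)
print(f"existing link rate: {by_type['existing'] / total:.0%}")
print(f"new link rate:      {by_type['new'] / total:.0%}")
print(f"unlinked cases:     {(total - sum(by_type.values())) / total:.0%}")

VALIDATION_THRESHOLD = 10  # linked cases before an article goes to the backline
for article, linked in Counter(a for a, _ in case_links if a).items():
    if linked >= VALIDATION_THRESHOLD:
        print(f"{article}: {linked} linked cases, queue for backline validation")
```

Watching the existing link rate climb (forty percent at launch, seventy-five percent two years later, in Alex's numbers) while TTR falls is the correlation he uses to credit the knowledge base for the efficiency gain.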
So we have a few minutes left, Alex, but I want to carry on with this theme of how product is recognizing these emerging themes from the support articles, the knowledge articles being written early in the process. How do you explain the benefits of KCS and this knowledge practice to those who think these articles are just competing with good documentation, or with other content that's being served up by the product team or others? Okay. So those are kind of two different... oh, I know what's going on. You use the words product team differently than how I use them. So let me talk about our documentation team, who write the user guides for our products. This actually predates KCS. We sat down with that team; the knowledge manager group from support sat down with them and said, hey, we're seeing some overlap between KB articles and user guides. Right? Yeah. We don't want there to be overlap. We don't want there to be duplicated or, worse, contradictory information. Absolutely. So we sat down and created a list: these are the qualities of the user documentation, this is its purpose, this is the intended audience. And for troubleshooting articles, this is the audience, this is the purpose, and this is how we're going to handle it. And this is how you separate the two, so you know which bucket something needs to go in. Now, there's still a little bit of gray between those two buckets, and you're never going to get rid of that. But generally speaking, your troubleshooting articles are problem, cause, solution. That's it. Whereas your user guides are: this is how you install it, this is how you use it, this is what this feature means, this is what happens when you click on this button. And so in that sense, like I said, there's still going to be some overlap, because you might have a problem that you fix using user documentation. When that happens, we tell people to write up an article anyway that says, you're experiencing this problem, here's the cause, go see this section of the user documentation to solve it, and we just direct them back to the user guides. And that happens a lot. We have hundreds of articles that just link back to the user guide for the solution, but we keep those articles around, because sometimes the context that's in your user guide doesn't match the context of the user, whereas with the article, we can use the context of the user and then link that back to the solution in the user guide. So they work hand in hand. Yeah. You can connect the dots for them there. Right. Well, I don't want to push everybody's calendars too far. There are so many questions now, I know; we could just keep going. And I love the way that we talk about product differently, and there's only so much time to unpack all of this. But I want to thank Alex for more time spent with his friends here at Coveo, and all of you for joining us today. We really do appreciate the participation. The chat was on fire. We had nobody raise hands, though, so I'm going to keep you on point for our next roundtable, everybody; I want to see more hand raising and coming off mute. But before we wrap, we have a learning series. It's probably more technical than the folks on this call want to engage with. It goes pretty deep into Coveo, but it's all about search performance.
So I think if you're curious, or you have your team looking at Coveo or looking to get the very best out of it, I highly encourage you to register for the session. Instead of putting a long URL on the slide that you can't follow, I've put it in the chat, and we'll be sure to send it along with the follow-up email today or tomorrow, as I promised, with the recording. With that, I'm very happy to close out our latest edition of our relevance roundtable. Alex, Ari, thank you again. To everybody else, have a great day. We appreciate your time. You guys are amazing. It was so much fun. So much fun. See you soon. Bye.
Fuel Your Service Strategy With Knowledge and Search
You can look at your data in hundreds of different ways, but is it the right way? How can you deliver personalized, relevant content when your audience is so different and diverse?
When search unleashes information quickly…
People can self-serve and help themselves without the wait.
Knowledge and search are the foundation of a successful service strategy, especially as support channels expand and customer preferences grow.
Hear from Alex VanFosson, Knowledge Engineer at OSIsoft, for a more in-depth conversation about how to understand customer behavior and agent proficiency, and how to measure success.
Your readers want to get their hands on the right information, fast.
In this 1-hour session, we focus on key themes like:
- Leadership, culture, and collaboration
- How to increase engagement and lower time to resolution across service channels
- Prioritizing and measuring the right things that make an impact on your business
Get tailored, AI-generated personal recommendations based on what your customers need, when they need it. Unify all the touchpoints across your customer's journey from awareness to purchase.
AI's ability to actively listen, learn, and respond will revolutionize the way you engage with prospects and customers.
Make every experience relevant with Coveo

