Hello. I'm Kelly Murray, the chief engagement officer at the Consortium for Service Innovation. Welcome, and thank you for being here. The Consortium is a not-for-profit think tank made up of about seventy member companies, and together we are the caretakers of the KCS methodology, along with other models focused on reimagining the way we work together. It's probably not a surprise that AI has influenced a lot of the recent conversations about the way we work together, especially over the last few months. Members of the Consortium have been very generous in sharing their approaches to the changing technology landscape, and you can find public resources on the website around topics like getting started with AI, automation health and predictive customer engagement, and a whole handful of examples of how Consortium member companies are approaching AI implementations. One of the most consistent things we're hearing right now is that organizations that have implemented KCS are uniquely positioned to harness the power of recent AI tools, because they have access to content that's accurate, structured, relevant, and in context. We love context. And that's exactly the kind of data that generative AI needs in order to be most effective. If you're interested in hearing more about the AI work the Consortium has done this year, we have a couple of additional public virtual events coming up next month, and I will put links in the chat. And then in March, which is right around the corner at this point, we'll be together in person in Atlanta at the annual summit, where we're going to explore the convergence between knowledge, people, and AI. Registration is open, and we would love to see you there. Today, I'm so pleased to introduce Laurel Portner, who's the senior director of digital services at F5. She also serves on the Consortium's board of directors.
Laurel has been helping the Consortium community understand the nature and scope of KCS Evolve Loop work for the last fifteen years or so, and she was named a Consortium Innovator in 2020, in part for her work on the Measuring Self-Service Success project. So I'm thrilled she's here to talk us through her experience at F5 with that topic. I'm also pleased to introduce Rob Rathwell, who's the vice president of customer success at Coveo. Coveo is a Consortium member and also a KCS-aligned vendor; KCS-aligned tools complement or enable KCS practices. In this case, Rob and Laurel are going to talk about how Coveo's AI-powered solutions boosted F5's self-service success rate. We are anticipating time for Q&A at the end. In the meantime, please put any questions in the chat. We are recording this call, and you'll receive an email with the recording in the next day or two. So, Laurel and Rob, take it away.

Thank you, Kelly. Hi, everybody. I'm just going to do a quick tech check and make sure that everybody can see a title slide on the screen. Excellent, I see some thumbs up and some heads nodding. So thanks, Kelly, for the introduction. My name is Rob Rathwell. I am the vice president of customer success at Coveo. I've been at Coveo for about a year, and I was a Coveo customer before I joined the company, as an end user of the Coveo solution in both the service and workplace models. So I'm pleased and thrilled that Laurel has joined me today so we can share some of the successes that F5 has seen. Really, this is a story about AI in action. It goes beyond the theoretical; these are real, tangible use cases, stories, and value elements from deploying AI in a production solution. So without further ado, I will get into the presentation, and I would be remiss if I didn't start with just a little bit about Coveo.
As Kelly mentioned, we are a Consortium member, and we've been around for a while. We are an AI-powered, search-first self-service platform, and one of the keys to remember is that when you deploy Coveo, it is a single AI platform: we bring AI to every point of experience within the digital journey. We have about seven hundred enterprise-plus customers, and we're a company of about seven hundred people. We are proudly Canadian but have a global presence, in both our customer and our employee base. And we've been deploying AI platforms for more than a decade. Obviously, with ChatGPT and all of the buzz around AI, certainly since the COVID timeline, AI/ML has become one of the most overused phrases of the last three to five years, but we've been at this for a while. We are proud partners in several forums, including this one, and have been recognized as a leading employer, not just in Canada but across the globe. Lastly, one thing that I always like to mention, and that makes me a proud Covean, is our one percent pledge. We donate one percent of our time, technology, and profits outside of our company, and we have added a fourth element: we also donate one percent of our equity, to help democratize access to knowledge and education around the world, something that we're very proud of. So that's a little bit about Coveo. Let me talk a little bit about what that means for you. As I mentioned, we are a search-first company, so we think that everything starts with a search. The results of that search are where we start to differentiate. We feel that the future of every interaction is business-to-person. We all talk about whether you're a B2B company or a B2C company.
Regardless of which category you fit into at that macro level, or even if you're both, ultimately the future is really about business-to-person. It's about an individualized, relevant experience. As Kelly mentioned earlier, everything is about relevancy and an individualized experience. It's about having the ability to generate those experiences across every interaction so that it's an intelligent, connected journey, and it drives superior business outcomes, not just for you and your business but for your customer and their business. And lastly, we think the combination of those things brings you what we call the AI experience advantage. Whether you've deployed Coveo for commerce, website, workplace, or service use cases, we are looking at a variety of potential outcomes, and I'm going to turn the microphone over to Laurel, the star of this presentation, very shortly, who's going to tell you about some real examples that F5 has realized as part of their partnership with Coveo. Whether you are in a commerce business looking to increase revenue and profitability; in a service business looking at customer satisfaction, optimizing your operation, and cutting service costs; in a website business looking for better digital engagement across your journey and a better success rate through search; or in a workplace setting, which is about proficiency, efficiency, and self-service: all of that starts with a search, and it will bring you these outcomes regardless of your use case. The key is that it strikes that terribly difficult balance between an outstanding customer experience and your ability to scale and achieve efficient business outcomes.
So with that, I want to very quickly take the opportunity to introduce Laurel Portner, the senior director of digital services at F5 and, as Kelly mentioned, a board member with the Consortium. She is an influential leader in knowledge management and an expert in KCS. She has received the Consortium Innovator award, and I'm proud to say she's a former Covean; as a newer Covean, I still see Laurel's name on a lot of our historical artifacts, so I learned from her before I even met her. She also has experience in knowledge, education and training, digital experiences, and more. With that, I'm thrilled to have her share the F5 story and their journey with Coveo. Laurel, over to you.

All right. Thank you, Rob and Kelly. I'm very excited to be here and share with you all, and excited to hear your questions at the end. So let's dive right in: first of all, who is F5? We're a company that's kind of a secret; a lot of people don't know about us, but we are everywhere. We secure and deliver half the world's apps, and we're committed to bringing a better digital world to life. We were founded in 1996, we've got over six thousand employees, and we're headquartered in Seattle; we're a global company. Nearly half the websites in the world go through our products, whether hardware, software, or SaaS. So it's an exciting and interesting world to be in. In terms of what my team does: digital services is focused on the ownership experience. We're focused on our customers post-sales, supporting our support teams, our services teams, and our customers and partners. So, a little bit about the portfolio. As I said: hardware, software, and SaaS.
BIG-IP is our flagship product, the one we started out with, actually in the gaming industry, trying to make things go faster; we've got hardware and packaged software under that. Building out from there, we've got two other product families: NGINX, which is software and SaaS focused on application delivery and security, and our newest product family, Distributed Cloud, which includes our SaaS delivery and a platform for hybrid multi-cloud environments. So I think that's a summary of F5. Then, how do we use Coveo? We use Coveo enterprise-wide on all of these different web properties. Like I said, my team is more focused on MyF5, but we work very closely with several of these other sites, including our community, DevCentral, our corporate website, and other documentation sites as well. It's been great to use Coveo over the last, coming up on, six years, during which we've indexed content from across the company so we can bring it to our customers and our employees through the search platform. So that's where we use it, on both the customer side and the employee side. This may be a familiar story to some of you: about nine to twelve months ago, our executive leaders came to us and said, we've got to get in the AI game. We've got to figure out how to launch some AI tools and figure out the best solution. We knew about Coveo; we knew about the roadmap and what was coming. So we set out to prove out whether we were going to buy something, build something, or create something, and I would say that F5 is doing all three.
We've got many, many teams across F5 that are all trying to do this same thing, looking at their different disciplines and figuring out how to be more effective, optimize, and stay in the AI game. So that was our mission.

Now, Laurel, I held back on the build-out of this slide on purpose, and we'll certainly revisit it, because that in itself is a very daunting mandate, right? Go out. Do you build? Do you buy? Do you do a combination of both? But this, to me, is the kicker.

Yes. We need it in ninety days. That's right. That definitely upped the game and prioritized quite a few things for us. It reminds me of Patrick Lencioni's The Five Dysfunctions of a Team: you really learn about your teammates and what you're doing, and you zero in on the task at hand and the goal. That's what we did. We pulled lots of people together, and if you go to the next slide, we'll talk about our approach. Oh, yes, and failure is not an option; that's the other part of this. We would have something; we just didn't know exactly what it was going to look like or what it was going to help us with. So yes, we definitely had lots of meetings up front trying to understand all the use cases, and I'll go into some of that process.

Excellent. So you've got this mission. Do you want it fast, or do you want it right? And the answer is yes.

That's right. And, oh, by the way, don't get it wrong; failure is wrong. And we have to have a return. We can't just put something in place and have it be a sunk cost; it needs to provide benefit and value to the company.

Got it. Focus on all those things. Yes.
If we were in an actual room together and I asked for a show of hands in the audience (insert your use case here), I'm guessing a lot of hands would go up around "do it fast, do it now, do it right, and don't fail." So, as Simon Sinek would say, we start with the why; now let's take a look at the what. Laurel, even this is a big list.

Yes, and this is certainly just a very small snippet of some of the things we were looking at across the disciplines, like I said. The wire graphic is how it felt at first, trying to figure out who was doing what. There were people doing the same thing, and we were just trying to align, so it took a few days or weeks to do that. Then came settling on which ones we were going to go after, and which ones we wanted to partner on with our IT organization and other departments to figure out the best approach, as well as the roadmap. It wasn't just one and done: we were looking short term, medium term, and long term at all these use cases and figuring out what made sense for each.

Got it.

The next thing we did was focus on people: really finding the scientists. Failure is not an option, but that doesn't mean you can't fail; it means you've got to fail fast, learn from it, move on, and figure out what you're going to do about it. So those were the people we focused on: people with a growth mindset, plus the skills and capabilities that come from detailed experience with those use cases. We took people who had taken cases, who had played with AI tools in the past, and who were comfortable with that.
The attitude was, "I'm not really sure what this thing does, but let's go see what it can do," and we gathered that feedback. Then we wanted to see how we were going to define success. We had to create our own testing scale very quickly and align on it. We had a lot of people in the same room, talking it out, walking through the different tools, running them through different scenarios, and asking: does this make sense in terms of how we're measuring it? Even the measurement part took some time to align and calibrate across the different teams. One of the things we observed as part of this process was: wow, this sure feels a lot like KCS and implementing a KCS program. It was so akin to that, and I think that's also why we were able to do this so quickly. We went through the same phases, granted, much more quickly than most KCS implementations, but again: focus on adoption first, then get the transaction loop going, that motion of testing things, then take a step back and ask, okay, how is this working for us? Improve on it and run it through again.

Got it. So you've got your mission, your timeline, your mandate. You've found your scientists, or your astronauts. I'm dating myself, but I'm a big fan of the Apollo missions, and there are only so many astronauts that can fit in that little spaceship. You need mission control; you need all kinds of help, just like NASA does. So how did you get to that next step? How do you know you're successful? How do you pick the right partners? How do you make sure you have that extended team in the room with you?
Well, one of the things we did, and this was part of our measurement, was to put ourselves in the user's shoes. It wasn't just for customers; it was also for our employees: okay, I'm getting this answer. Does it make sense? Is it something I could use in my workflow? Then, how do I score that, and what are we going to use to measure success? It wasn't just about accuracy; it was usability, user sentiment, and practicality: could I just copy and paste the answer, or was I going to have to tweak it? For those of you who have used AI tools, it doesn't just magically happen because you put in the right prompt. It's learning as you go: you put in that prompt; oh, yours is much different than mine. Why? And what do we need to do to make it more consistent? So here are some of those quality rating definitions: did we have to completely overhaul the answer, or did it need only minimal editing? That was the goal we were looking at. Not to mention cost-benefit, and whether this was something that would force us to completely revamp our architecture and involve developers, which would push out the timing. So there were lots of different variables that we used in measuring this.

Got it. And here are some of the measurements and our responses from the testing results. I also want to point out that this was the final version; it was interesting that we went through several iterations of it and did some tweaking, and it was very interesting to look at how we learned through the testing process. People learned how to use the tool, right?
They learned more about what AI does and how their prompts could influence the accuracy and the output. Just in a matter of weeks, it was so interesting to see the results of that. And it was the people side; I mean, the tool learned as well, and we brought in different sources and different datasets and things like that. But the fascinating thing for me was the change in the people in a very short amount of time.

Yeah, that's great to hear. That was the first big ChatGPT AI/ML myth, right: oh, people are getting replaced. I know you're going to talk about it in a little bit with some of the actuals from the F5 experience, but it's not about that at all. It's about how people get more efficient and proficient, and you optimize where the human interface is. That's the most exciting part for me.

Right. Exactly. So why don't we go into a bit of a live demo now? I think this is MyF5.

Yes, I'm going to steer the demo, and Laurel, you can go ahead and voice it over.

Okay, sounds good. So this is our ownership platform, and it offers both an unauthenticated and an authenticated experience. We've got our search box, which has been there for years with Coveo, at the top, and we're putting in a fairly basic, common prompt from our customers about how to upgrade the software. You can see the results using generative answering and retrieval-augmented generation, which is super important here because you get the citations. Not only did it give you a generated answer; it also cited its sources, and you still see the search results below it. So you've got some options, and there's feedback right there.
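The generated-answer-plus-citations pattern in the demo is retrieval-augmented generation: retrieve relevant documents, generate an answer grounded in them, and return the sources alongside the regular results. As a rough illustration only (this is not Coveo's API; the keyword-overlap retriever and the answer composition are toy stand-ins for a real vector search and LLM call), the shape of the flow looks like:

```python
from dataclasses import dataclass

@dataclass
class Doc:
    doc_id: str
    title: str
    text: str

def retrieve(query: str, docs: list[Doc], k: int = 2) -> list[Doc]:
    """Rank documents by naive keyword overlap with the query (toy retriever)."""
    terms = set(query.lower().split())
    scored = sorted(docs, key=lambda d: -len(terms & set(d.text.lower().split())))
    return [d for d in scored[:k] if terms & set(d.text.lower().split())]

def generate_answer(query: str, docs: list[Doc]) -> dict:
    """Compose a grounded answer plus citations.

    A real system would pass the retrieved passages to an LLM; here we
    just stitch the passages together to show the payload shape:
    answer text, cited sources, and the underlying search results.
    """
    hits = retrieve(query, docs)
    if not hits:
        return {"answer": None, "citations": [], "results": []}
    answer = " ".join(d.text for d in hits)
    return {
        "answer": answer,
        "citations": [{"id": d.doc_id, "title": d.title} for d in hits],
        "results": [d.doc_id for d in hits],
    }

# Hypothetical knowledge base articles, loosely modeled on the demo query.
kb = [
    Doc("KB001", "Upgrading BIG-IP software",
        "Download the upgrade image and install it on the standby unit first."),
    Doc("KB002", "Resetting a password",
        "Use the reset utility from the console."),
]
resp = generate_answer("how to upgrade the software", kb)
```

The point of returning citations with the answer is exactly what Laurel highlights: the user can verify the generated text against its sources instead of trusting it blindly.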
And yes, you can go to my.f5.com and do some searches on your own; you'll have that similar search experience, but now with generative AI. We also implemented it internally. We're using it in a few places, including inside our live chat; we use Salesforce. One of the things we saw is that we are using both Einstein and Coveo at the same time, because we saw value in both for different use cases, which was super interesting. You can go across multiple sources using the generated answer on the left, and you can copy it, while the Einstein recommendations are a little more embedded in the workflow and a little easier, with single clicks. So there are a couple of pros and cons to both. And yes, Salesforce Einstein is Salesforce's AI platform. We also use Coveo in our case management screens via the Insight Panel, and we added generative answering onto that as well. This is just a quick view of a generated answer, same type of thing: you've got your citations, and if you scroll down, you see the search results as well. You can also toggle the generated answer on and off, and we see different use cases from our support engineers; there are times when they're just looking for a document and may turn it off. So there's some flexibility in here.

Yeah, thank you for that. I know we're going to get into some of the actual results that you've seen, and I think the results slides are a few weeks old now, which is a lifetime, as we've talked about, so I know you've got some updates there too. We've learned a lot so far about the Coveo technology, but it's really only one leg of a three-legged stool. As all of this audience knows, there's people, process, and technology.
At Coveo, yes, obviously we are a software vendor; if you look us up, the first thing you'll probably find on Google is that we are an AI vendor. However, we are a partner with our customers, and our customers' success is our success. Some of the successes that Laurel is going to share from F5, we're equally thrilled about at Coveo. As part of that stool, we ultimately focus on three things at Coveo. The first is adoption of the software. We've talked about the mission and the mandate and the challenges that F5 had, and the timeline: do you want it right, or do you want it fast? And the answer is yes. Even if all of those things come together, the best software in the world is of no value if you can't use it effectively, efficiently, and at scale. One of the things we're most happy about with Laurel and the F5 story is that it's a shining example of how those things can come together. When you look at the requirements on paper in the early days, they're extremely daunting; it's a massive challenge. But we're big believers that you can boil the ocean as long as you do it one cup at a time, and if you get those cups in the right order, then some of the results we're about to share with you are definitely achievable on a very aggressive timeline. The next thing is value. Our customers partner with us because we are most likely to bring value to their organization. We definitely recognize that, and we deliver on that trust; it's not something we take for granted. There is no bigger sign of trust than when someone chooses Coveo to partner with them, and it's our job to ensure that we never break that trust. We will measure it, capture it, and prove it along with you, to your stakeholders and your customers. And lastly, there's innovation at Coveo.
Obviously, AI is a bleeding-edge technology; Laurel is going to share some of the actuals with you, and as soon as they become weeks old, they're almost stale or obsolete, because the technology evolves so quickly. So when you invest in a partnership with Coveo, you're not just buying software: you're establishing a partnership with someone who's going to accompany you in your business transformation, and you're also subscribing to our innovation, not just our software. Go all the way back to that mission statement (do you build it? do you buy it? and you've got to do it in ninety days): all of these things have to come together on our side to help make sure you succeed. And with that, I want to give it back to Laurel so she can talk about the specifics of some of the successes F5 has seen, both internally and externally with their customers. Laurel, back over to you.

All right, sounds good. Thank you. Yes, I completely agree with all of that, and I'll get into it a little more. So, right off the bat, we saw an improvement in our self-service success rate. We did some A/B testing, which was super easy to do with the Coveo platform, and we're starting to look at that in more detail as well: what other tweaks and changes can we make to the platform to improve self-service success? We're also looking at different ways to measure. As Kelly said, part of my work a few years ago was on the Measuring Self-Service Success project team at the Consortium, and there's a lot of innovation around this that I see coming in the months and years ahead. I'm super excited to collaborate with the community on this, because I think we're going to do a lot of learning around it. But yeah, that was super exciting to see.
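A/B testing a self-service success rate like this ultimately comes down to comparing two proportions: the success rate of sessions that saw the new experience versus the control. A minimal sketch with a standard two-proportion z-test; the counts and the "resolved without a case" definition are made-up illustrations, not F5's actual data or methodology:

```python
import math

def self_service_success_rate(resolved: int, sessions: int) -> float:
    """Fraction of self-service sessions that ended without a support case."""
    return resolved / sessions

def two_proportion_z(success_a: int, n_a: int, success_b: int, n_b: int) -> float:
    """z-statistic for the difference between two observed success rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: variant A has generative answering, variant B does not.
z = two_proportion_z(620, 1000, 550, 1000)
significant = z > 1.96  # ~95% confidence, one comparison
```

With a z-statistic above roughly 1.96, the lift is unlikely to be noise at the 95% level; with real traffic, you would also pre-register the sample size and the success definition before looking at results.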
And again, our return was very fast; I think our return on investment came in six months. The other thing I want to point out is why we think this was so successful, why we got it stood up so quickly. What Rob was talking about, in terms of how Coveo co-creates value with its partners: we absolutely saw that. I would not have been able to get this approved so quickly without the help of our customer success manager, and they're still right in there with us every week, looking, learning, trying to tweak, figuring out how we can improve. But the other part of this is the data. It's the content. It's the context. And Knowledge-Centered Service, KCS, absolutely empowers you to leverage that with these large language models. We started with Coveo in 2018, early in our KCS journey, and since then we've been able to shift the landscape of issues coming into our assisted support channel toward a higher percentage of new issues, with more being solved through self-service, lower-touch channels. I think we have doubled the size of our knowledge base (and that figure is actually a year old now); we're over forty thousand articles. So we've got a lot of rich data with context to work with for AI and machine learning. Let's go to the next slide to talk about what is next.

Yeah, to stick with that Apollo analogy: you've been to the moon. So what's next? Do you colonize the moon? Do you go to Mars? What's next for F5?

Well, it's funny; somebody asked about change management, and I think you don't have a lot of time for change management when you have to implement something in ninety days.
So we are definitely focused on that now, and we've noticed that it definitely impacts your metrics and your success when you find out how people are using the product and help them use it better. Coveo has been helping us do that: our customer success manager did some shadow sessions with users to identify areas where we can do training and gather feedback. Plus, we've been partnering with the development team and giving them feedback on the product, and I believe their latest release doubled the answer rate, just with that feedback. These are some later measurements that we don't have on the slides: our answer rate was about twenty-five percent when we first launched, and now it's fifty percent. It was super impressive to see that fifty percent of the searches happening now get a generated answer. And again, we're focused on the people, and on having more conversations to dispel the myths and address the resistance and all of those questions, because it's very scary. Does this sound familiar? Yes: it's very familiar to anyone who has implemented a knowledge management program, a KCS program, or anything that involves change. One of the things I want to highlight here that we have found is that AI is not pixie dust, to borrow from a friend of the Consortium. It needs context. It needs content. It needs data. And to get that, the people doing the work have to get their tacit knowledge documented somehow so that it can be fed into the large language models, pure and simple. However you want to do that: we're focused on documenting cases and on knowledge creation.
Those are the two main things, but there are lots and lots of different ways to capture what you learn and what you know so that it can then be reused by AI.

Excellent. With that, I think it actually brings us to our formal Q&A session. I think we've answered a couple of questions in the chat, but just looking through it, why don't we open it up and see if there's anything we haven't answered, or if anyone wants to ask us anything in real time. If you want to stop sharing, Rob.

Oh, of course. Sorry.

No, that's great. I was just going to add to the change management piece. We've talked about how KCS really sets organizations up to be successful with AI because it gives them a practice around how to deal with the information and data that AI needs. But Laurel, what I heard you say also is that perhaps your years-long KCS practice has given you some of the change management tools to address any objections that come up. It provides practice both in capturing structured content and in managing the change around adoption requirements.

Right. We've got a robust coaching program, and we are leveraging it to the hilt on this, making sure that our coaches are the spokespeople for some of this and also our feedback mechanism: understanding what's hard. What's hard about this? Where does some of that resistance come from, in terms of using the tools? I think that has really helped us and is going to help us continue to improve. I just have to mention: I read an article last night that I thought was interesting. It was a study of doctors using ChatGPT, and it found that ChatGPT diagnosed problems better.
And the reason was that the doctors didn't know how to prompt. If you just fed in an entire case, with chart notes and all of that data, the diagnosis from the tool was actually better, because the human trying to prompt and choose words didn't do as well. I just thought that was super interesting. Once we learned some of this prompting, we actually created knowledge base articles for our customers and for our employees to help them learn how to use AI, because it's very different from search. It doesn't work the same way, so you can't just put in a few keywords like you're used to; it's not going to give you the results you need. That's so funny. It makes me think about how we joke that you have to teach people how to search inside knowledge bases, because we don't have the volume to do what Google can do and answer questions with a word and a half. And this is that times a million: give me all the information, just keep typing. Yes, more context, please. Marie had a question about monitoring tools to measure how well the generated solution is doing. Coveo has extensive user analytics that we use all the time, looking at click-through rates and answer rates. One that I really like, and that we're starting to measure and promote, is the number of searches per user per session, and we're starting to see that go down. That is hard data showing it's reducing time for your users. When an answer is generated, there are about half the number of searches compared to a session where an answer wasn't generated.
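As an aside, the searches-per-session comparison Laurel describes is straightforward to compute from a search event log. Here is a minimal sketch, assuming a hypothetical log format of (session_id, got_answer) tuples, one per query; this is an illustration, not Coveo's actual analytics schema or API.

```python
from collections import defaultdict

def searches_per_session(events):
    """Average searches per session, split by whether any query in the
    session received a generated answer. `events` is an iterable of
    (session_id, got_answer) tuples (hypothetical log shape)."""
    counts = defaultdict(int)     # session_id -> number of searches
    answered = defaultdict(bool)  # session_id -> any generated answer?
    for session_id, got_answer in events:
        counts[session_id] += 1
        answered[session_id] |= got_answer

    def avg(ids):
        ids = list(ids)
        return sum(counts[s] for s in ids) / len(ids) if ids else 0.0

    with_answer = [s for s in counts if answered[s]]
    without = [s for s in counts if not answered[s]]
    return avg(with_answer), avg(without)

# Example: sessions that got a generated answer needed fewer searches.
log = [("a", True), ("b", False), ("b", False), ("b", False),
       ("b", False), ("c", True), ("c", False)]
print(searches_per_session(log))  # (1.5, 4.0)
```

Tracking these two averages over time gives the "about half the searches" comparison as a single pair of numbers per reporting period.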
So you're not having to read through every document in your search results; you've got a summary right there. And again, it depends on your prompt and your use case, but on average, that's one of the metrics we're looking at. I'll just add to that as well, and I am probably the least technical person on this call, so I don't want to get out of my depth. But one of the things we also ensure, to way oversimplify it, is that at Coveo we have AI for the AI. It prevents hallucinations and false or non-relevant responses, so in our solution at this point, and Laurel, you can keep me honest here, that's not really the concern. One of the other measurements we keep very close track of is no-response queries. If a query doesn't yield a response, that's problematic: when you're indexing forty thousand documents, there should be a response somewhere. So that's another one of the indicators we use. Through the testing phase, we'll make sure the responses are relevant and that they're not reaching out to sources they shouldn't. But ultimately, once you're deployed and live, every query should get a response if we've indexed the content properly. So that's one of the other elements we monitor very closely. Right. And I just saw a question flash by: yes, we are grounding the LLM in our knowledge base. That's part of the RAG and vector database plumbing behind the scenes. Is it also grounded in historical cases? Yes, internally we're using cases as well. Thanks. Kevin asks, how can we use the AI-powered system to foster a culture of sharing and collaboration within our organization, encouraging employees to contribute their expertise and experiences? Yes.
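The "plumbing" Laurel mentions, grounding the LLM in the knowledge base via retrieval, can be illustrated with a toy sketch. This is a generic retrieval-augmented generation outline: a keyword-overlap retriever stands in for a real vector database and embeddings, and a placeholder string stands in for an LLM call. None of the names here reflect Coveo's actual implementation. It also shows the no-response check Rob describes: if retrieval finds nothing, flag the query instead of answering.

```python
def retrieve(query, docs, k=2):
    """Toy retriever: rank documents by word overlap with the query.
    A real system would use a vector database and embeddings instead."""
    q = set(query.lower().split())
    scored = [(len(q & set(d.lower().split())), d) for d in docs]
    scored = [(s, d) for s, d in scored if s > 0]
    scored.sort(key=lambda x: -x[0])
    return [d for _, d in scored[:k]]

def answer(query, docs):
    """Grounded answering: only generate when supporting content exists."""
    context = retrieve(query, docs)
    if not context:
        # No-response queries are a monitored signal: with a well-indexed
        # knowledge base, nearly every query should retrieve something.
        return None
    # Placeholder for an LLM call; a real system would prompt the model
    # with `context` so the answer stays grounded in the knowledge base.
    return "Based on: " + " | ".join(context)

kb = ["Reset the admin password from the console",
      "Configure health monitors for a server pool"]
print(answer("reset admin password", kb))  # grounded answer
print(answer("unrelated gibberish", kb))   # None -> flag for review
```

The design point is that grounding and no-response monitoring fall out of the same step: the retrieval result either constrains the generation or tells you the index has a gap.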
That is a very good question. It's something we hear at the consortium: show them the value they're creating. And that's something we are very cognizant and intentional about. Every time we have a town hall, we show people the value they created in the content they published, not just to customers on self-service but also in reuse, and what's happening internally as they share with their coworkers. I think that is key. Another area we have indexed is our collaboration tools, and that is starting to provide some interesting results and feedback. Just a conversation in Slack or Teams or whatever you use, if you're doing intelligent swarming and you can index and search it, provides value. So again, it brings people in, and then you see who the people are, so you can connect people to other people when you have a new issue you're not sure about, there's no content out there, and you've got to create it. Could you share insights on the build-versus-buy dilemma? Oh, something just jumped. Hang on. Yeah. Every organization faces this challenge. Are there any use cases for choosing build? Oh yes, absolutely there are. In fact, we were just looking at one that some of our engineers had built, a nuanced use case that somebody saw a need for. The issue you're looking at is operationalizing it and productizing it. Right? With Coveo, it's out of the box: implement it, figure out what sources you're going to index, and essentially turn it on. But you have to go to the website, or to a particular area of your CRM system, or whatever it is, and it may not be exactly in the right spot where you need it. Right?
Whereas building something to call an API and put it directly into a workflow is different. Or maybe it's analysis around KCS, because that's another area we're looking at, where we're using LLMs on the massive amounts of data you get in support and services: what do we do with this, and what insights can we gain? That's another area where you would choose build over buy. Yeah, I'll expand on that, and this is coming from the vendor. We don't even look at it as build versus buy; it's what should you build versus what should you buy, because it's a combination of both, as Laurel said. And the key for us, and remember, I was a customer of Coveo's before I joined the company, is that it's not necessarily about the build in the moment. Now, I would say when you get that ninety-day mandate, it's pretty tough to build and deploy, so there are some advantages there. But the build dilemma really hits you when you get into maintenance and scalability, and that would be what I would look at the most. There are certainly elements you can and should build rather than buy, but at a certain point they might have a shelf life. So it's about what you want your core competency to be. Do you want it to be maintaining the AI elements of your solution? As you scale and have to maintain, that's where the consideration comes in from my perspective. But I agree one hundred percent: there are absolutely elements of build and elements of buy that should be considered in every decision as you move forward in the journey. Yeah. And I'll add one more thing, which is that the ability to A/B test was absolutely a requirement for us to get funding. Sometimes you actually have to build the A/B test, and that takes more time. This was already baked into the platform.
You literally just turn it on, and we had an A/B test running in twenty-eight days and could measure. So that was the other key element in getting funding for the investment. I would love to introduce and say hello to Kendall Branaise, who works with Laurel at F5 and is doing a lot of work in the chat answering questions. Are there particular questions? Let me see if I can find it: from Vijay Watkins, asking what metrics we use to track the self-service success rate. So, Laurel, would you be up for going a little deeper into that particular number you showed? Yeah, and you jump in too, because we are using Coveo Analytics, but we also use Adobe Analytics. We want to see where everyone is coming from. You saw the eight or so web properties that are using Coveo; we can see that click stream of where users are coming and going. Adobe also lets us see where they're going after that, especially when they're referred by a search engine. In terms of our self-service success, we use a measurement called article exit: if you land on an answer and that was your last event, that is a self-service success. And then we also use feedback to triangulate. We've got several different surveys on the site and, in the generative answering, the thumbs up and thumbs down. Did I miss anything, Kendall? Just one other piece: that's after we've excluded things like bot traffic and bot behaviors. If any of you aren't looking at bot behaviors, we would definitely recommend doing so, considering there is a population of visitors to your portals that are probably not human. But yes, after excluding bots. Right.
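The article-exit measurement Laurel and Kendall describe can be sketched as a session-level computation. This is a simplified illustration over a hypothetical session log (an ordered list of event types plus a bot flag), not F5's, Adobe's, or Coveo's actual implementation: after excluding bot sessions, a session counts as a self-service success when its last event is an article view.

```python
def self_service_success_rate(sessions):
    """Fraction of human sessions whose last event is an article view.

    `sessions` is a list of dicts like
    {"is_bot": False, "events": ["search", "article_view"]}
    (a hypothetical log shape, not any vendor's schema).
    """
    human = [s for s in sessions if not s["is_bot"] and s["events"]]
    if not human:
        return 0.0
    successes = sum(1 for s in human if s["events"][-1] == "article_view")
    return successes / len(human)

sessions = [
    {"is_bot": False, "events": ["search", "article_view"]},     # success
    {"is_bot": False, "events": ["search", "case_submission"]},  # not
    {"is_bot": True,  "events": ["article_view"]},               # excluded
    {"is_bot": False, "events": ["article_view"]},               # success
]
print(self_service_success_rate(sessions))  # 2 of 3 human sessions
```

As the speakers note, this is an assumption-laden proxy (the user may have left unsatisfied), which is why they triangulate it with surveys and thumbs-up/thumbs-down feedback.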
We keep looking for that one magic measurement, and it just doesn't exist. That's right. I wish there were a single number or a single measurement we could use, but we have to make some assumptions about what's happening. Wonderful. Any other last-minute questions? I think we have time for one more. Or anything else, Kendall, we should poke at from the chat? There was a question, I think... where did it go? I lost it. It was around providing a framework for how we went about achieving this. And this is a really long question; in fact, we could probably do an entire hour just on this, if not two or three, so forgive me if I don't fully answer it. When it came to testing, we targeted our testing against known issues. That was our framework, if you will, because we knew the particular use case we were exploring wasn't meant to help solve or accelerate new issues; it was to target known ones. So we handpicked the top known, repetitive issues we would see in the support organization and tested the top one hundred of those. And to try to remove human bias, we distributed that testing across many testers. Just because a test was run once didn't mean we were done; we would actually run it two or three times with two or three different people, each with their own unique prompting, to reduce or average out the human bias in testing. I don't know if that answers the question for that individual. I can't remember which slide it was, but when I was looking at all the different measurements for success, there was one more. We partner with our IT organization, which has a business transformation office that has built out a framework with a lot of our input as we were doing this. It's very new, and we're iterating on it.
But there's also a major decision framework, or MDF score, that covered some of those other things like the architecture and the build versus buy. So they really helped define that piece of it, as they should from a platform and tooling perspective for the systems of the company. Yeah, and as you said, Kendall, we could probably do an hour just on some of these metrics. And Kelly, to your point, there's no silver bullet here, no one perfect metric. One of the questions was whether we will share out the slide deck, and we can definitely do that. But one thing F5 also touched on that can actually be a deceiving metric without context, very similar to build versus buy, is known versus new. As known issues become more and more self-solved, you can focus on the new issues and start to build out that knowledge base, and it can start to look like your time to resolution is going up. That's not the case at all; it's actually the optimization you want, so you can continue to scale over time. So every metric we talk about has context from your business and where you are in your journey. Absolutely. That is definitely a whole other call. That's right. And getting ready for those metrics to move in the wrong direction: sometimes it's very helpful to set some executive expectations around that, because it is actually an indicator of success. Our humans are now working on the new stuff; we're not reinventing the wheel over and over. Wonderful. Thank you so much to everyone who joined us. Questions we did not get to, we will reply to afterwards. Thank you so much, Laurel and Rob, for your time and for sharing your story. I hope to see everybody again at another consortium event soon.
Have a great day. Have a great day. Appreciate it. Bye, everybody.
AI in Action at F5 with Coveo
Hear from Rob Rathwell, Vice President of Customer Success at Coveo, and Laurel Poertner, Senior Director of Digital Services at F5, about how implementing Coveo’s AI-powered solutions boosted F5’s self-service success rate by 11%.
Self-service platforms powered by AI enable customers to find answers faster and more accurately, reducing the strain on human support agents and improving customer satisfaction. By leveraging AI, F5 saw a reduction in case submission rates and higher engagement with its self-service portal. Watch to learn more.


