Hello, everyone, and welcome to today's live webinar, Lessons Learned: From Search to Generative Answering, brought to you by the Technology & Services Industry Association and sponsored by Coveo. My name is Vanessa Luzero, and I'll be your moderator for today. Before we get started, I'd like to go over a few housekeeping items. Today's webinar will be recorded, and a link to the recording will be sent to you within twenty-four hours via email. Audio will be delivered via streaming. All attendees will be in listen-only mode, and your webinar controls, including volume, are found in the toolbar at the bottom of the webinar player. We encourage your comments and questions in the box on the top left corner of the webinar player, and we will open it up for a verbal Q&A at the end of today's session. Lastly, feel free to enlarge the slides to full screen at any time by selecting one of the full-screen button options, which are located on the top right corner of the slide player. I would now like to introduce our presenters today: John Ragsdale, distinguished researcher and vice president of technology ecosystems for TSIA, and Patrick Martin, chief customer officer for Coveo. As with all of our TSIA webinars, we do have a lot of exciting content for you, so let's jump right in and get started. John, over to you.

Well, thank you, Vanessa. Hello, everyone, and welcome to today's webinar. Generative answering, generative AI, GenAI is one of the hottest topics in support today. Our support research team is getting an awful lot of inquiries around this topic, and I'm thrilled to have Patrick with me today. We're going to be learning some of the lessons learned and best practices from the pacesetters who are already experimenting with and using this technology. It's fantastic to be able to go from concept to reality and really help you see what the early adopters are learning from this.
So a shout-out to the support research team, Dave and Sarah and Kevin, for sharing with me this data from a recent survey they completed about how people are adopting GenAI and the impact on knowledge management programs. We see that currently twenty-seven percent of our members say they are already experimenting with this technology, and sixty-four percent are planning to start experimenting within the next six months. We see that the majority of companies are using it internally within support for their tech support agents, fifty percent are using it externally with customers, and forty-seven percent are really looking at cross-business-unit use cases. I think that is a really good sign, because obviously GenAI can come in handy for anybody who works directly with customers. Similarly, when we look at the budgets, thirty-three percent are funding this through support, but fifty-four percent are getting budget from cross-enterprise or cross-business-unit expenses. So again, it's really good to see that companies understand the potential for GenAI across the entire customer journey. Specifically within support, sixty-four percent of companies understand that GenAI is going to have an impact on their knowledge management program, and twenty-seven percent say a major impact. So for all of you who have been struggling to find enough content authors to generate new knowledge base articles, this is going to be a real big assist for you. When we look at where you should be getting your GenAI technology, the majority of companies, eighty-one percent, are looking to their unified search, intelligent search, cognitive search, however you want to describe it, their AI-based search provider. The reason this makes sense is that that system is already doing analytics to understand what is the best content to answer a particular question.
When you bring the GenAI on top of it without that pre-learning, it's going to take a while, because most of you have a lot of content, probably a lot of mediocre and even bad content out there, and the GenAI tool is not going to be able to decipher the right answer from the wrong answer across all of that content. But putting it on top of machine learning that has already done that heavy lifting from your employee searches and your customer searches, I think that's going to really help eliminate some of the hallucinations that we're hearing about. Patrick is going to provide some real-world examples of that today. The final data slide I wanted to share: I recently completed the twenty twenty-three technology stack survey to understand adoption and planned spending around various technologies across all of the research practices. This is taking a look at the planned spending for this year and next year for AI, automation, and knowledge management tools within support. You see the second category, AI-based support technologies, encompasses a lot of tools, including GenAI for employees, and seventy percent of companies had budget for that this year, with forty-nine percent planning an investment next year. Similarly, on knowledge management, intelligent search, and self-service portals, really high planned spending. Considering the state of the economy and how tight everybody's budgets are, I think it's a really good sign to see that support is getting the budget they need to invest in innovative technology. It's going to impact employee productivity and the customer experience, and we know that impacting the customer experience ultimately has an impact on long-term account value, net revenue retention, et cetera.
The final word I want to say on this topic: I know a lot of companies are rushing to get this out there for their customer self-service experience, but a lot of you have tried various chatbot attempts previously, and not all of them have been very successful. In fact, we have survey data saying that customers are somewhat unwilling to use those traditional chatbots. But customers are really interested in GenAI, and I think you're going to find adoption of this technology by customers to be pretty high. So make sure that when you're rolling it out, you're delivering an exceptional customer experience. I would say do a test with some of your friendly customers; make sure that you're piloting this and getting some feedback before rolling it out on a larger scale. So that's enough data from me. I think it's time to bring in our expert today. Patrick Martin is a longtime TSIA member. He's been a support executive for most of his career, and now he is chief customer officer for Coveo. Patrick, welcome back to another TSIA webinar.

Thanks, John. Great to be here. It's definitely an exciting topic for this audience, because the last twelve months have been somewhat of a whirlwind for the industry, ever since ChatGPT came to life, if we can say that. It's been a pretty exciting journey. There's a lot of hype around it. We're seeing it, and I'm sure the whole TSIA community is seeing it as well. As we talk today, I'll give a little bit of background as to why we're talking about lessons learned: for approximately the last six months, we at Coveo have been working with approximately forty to fifty of our customers around leveraging GenAI, mostly in the self-service part of the house, and we've been testing it out with some of our customers.
They've been helping us design the whole product and approach here, and over the last six months we've learned a ton from them. What we want to do today is really focus on the three main topics that bubbled to the top in terms of lessons learned. So these are the three things we're going to cover today. The first one, and you definitely touched on it with your stats, John, is the content strategy and its importance. A lot of our customers go into this thinking, oh, we're going to just leverage all of our documents and the LLM will figure it out. Well, the reality is that most of our customers who are getting the best results right now are actually working with a subset of their content, not their entire content, and we'll get into that a little bit later. The second one, which you also touched on, John, is the experience with chatbots. Everybody's curious now, saying, okay, what does it mean when we think about bringing in GenAI and making it conversational? What does that mean for my self-service channels? How is all this going to converge, or how is all this going to play out? The last one we're going to talk about is the analytics. For a very long time as an industry we've been measuring self-service success through implicit and explicit case deflection, and we've been looking at content consumption and all that. But the reality is GenAI is going to change the entire scope, and we'll talk about that a little bit as well. So looking into the first topic, which is content strategy, let's take a step back. If we look at the search experience, this is what people have been accustomed to. Right? They have a search box.
They can put in their keywords, whatever they like, and most search providers out there will have query suggestions that are personalized to whatever information you have about the customer. Then the user can navigate your content in any way, shape, or form, whether it's new content or old content, or you might have some things that are similar. If you look at the middle line here, you have "how to reconcile accounts easily in the system," and the other one is "how to reconcile your bank accounts step by step." They're complementary in nature, and that's fine, because the user can use facets and different ways to navigate through your content, and the experience is pretty much on them. So you can definitely leverage all the content that you have, and your search will display the most relevant information for that. However, when we're moving to generative answering, now we're more into the world of people asking questions in their own language, and what they're expecting is an answer such as this one. Where content becomes very important, when we think about the content strategy, is that if you're not leveraging the most updated content or the most relevant content for that particular user, this is where you start to get into hallucination territory. Take my two examples that were very similar: if you combine these together to generate an answer, it could end up being a wrong answer for the users. You have to be very thoughtful about the content that you want to use, how you're going to use it, and how you're going to make it personalized to the user, because otherwise it could generate some wrong answers. So let's start thinking about your content, per se.
The first step that we've learned in our journey is all about putting some emphasis on the right information. So definitely turn to your search analytics and leverage that to really understand which content or which document types are actually yielding the best results right now, which ones are being consumed the most. This is probably what you should focus on. Let's say a mix of your product documentation and your knowledge base are the ones that are used the most. Well, there's probably no need to bring in your community threads and your blogs, your online communities or social networks and all of that, because they're not being used that much, and they're not necessarily factual. So you want to stick to that. Finding a way to limit the content that's used to generate answers will definitely give you higher-quality answers, and of course they're going to be more accurate. The second one is about standardization. We've heard customers wanting to bring in PowerPoints, Excel spreadsheets, and a lot of other things, and this is all really unstructured. The best results you're going to get come from focusing on a standard structure of documents: making sure the content is well laid out in the document, that you don't have irrelevant fields like footers or headers, and that you're not bringing in the five-hundred-page PDF, which is probably not going to help either, because you're probably going to have a lot of repeating information that could be confusing and could generate some misleading answers. The last one is really around content accuracy. You really want to make sure that you have the most recent versions, that they are factual, and that you have a process in place within your content and knowledge management strategy to ensure that everything is being kept up to date.
A good example of that, one I like to use: let's say you have an iPhone and you go to Apple's site and ask, how do I do this on my iPhone? You're probably going to get documents that touch three or four different versions of iOS. But if you don't know your version of iOS and something is being generated, it might be taking content from all these different documents from different versions, so the most recent version might not be the one used to generate that answer. You want to make sure that everything is current, and if you need an older version, you can have that in your search results and filter through to it. So these are the three things that really come to mind when it comes to content strategy, and definitely put an emphasis on this, because we've been talking about garbage in, garbage out for years, but the last six months have taught us that it takes on even greater significance with generative AI, because that's the only way to really limit the chances of having any type of hallucination, especially when you're using it in a customer-facing use case. If we move on to the second lesson learned, it's about conversational experiences, and I'm going to go into a little bit of exploratory territory around what we're seeing and what we're hearing from our customers. One of the things customers are actually telling us is that there is a paradigm shift that's happening, or at least that they want to see, in their customer experience, which is that the worlds of search, recommendations, chatbots, and personalization, everything, they want it all to converge into one.
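Before moving on, the first content-strategy step discussed above, using search analytics to keep only the source types that actually perform, can be sketched roughly like this. This is a minimal sketch under assumed data shapes: the event tuples, source names, and threshold are illustrative, not any vendor's actual API.

```python
# Sketch: pick which content sources feed the generative index, keeping only
# the document types that users actually click on in search (hypothetical data).
from collections import defaultdict

def select_sources(events, min_share=0.25):
    """events: iterable of (source_type, clicked) pairs from search analytics.
    Returns the source types accounting for at least `min_share` of all clicks."""
    clicks = defaultdict(int)
    total = 0
    for source, clicked in events:
        if clicked:
            clicks[source] += 1
            total += 1
    return {s for s, c in clicks.items() if total and c / total >= min_share}

events = [
    ("product_docs", True), ("product_docs", True), ("kb_article", True),
    ("kb_article", True), ("community_thread", True), ("blog", False),
]
print(select_sources(events))  # keeps product_docs and kb_article; drops the rest
```

The sources that survive the cut would then be the only ones indexed for answer generation; everything else stays reachable through ordinary search.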
Historically, we've had several channels for different purposes, and that approach has introduced friction in the customer experience. But with GenAI, the customers we work with are really seeing this as a way to completely transform their customer experience and remove a lot of that friction. So really coming to the table with the concept of a one-stop shop to do everything is where they're pushing. If you think about this paradigm shift and put yourself in the shoes of your customers right now: do you really want to have separate experiences, a search query over here and a generative experience over there? Probably not. What things are tending toward, at least with those customers we're working with, is more of an intent box: a centralized place where they can go and ask whatever they want. What that really brings is this concept of a single unified intent or interaction mechanism through which your customers or your users can interact with your content. And this brings in the new notion of the generative experience, because that is starting to be the new expectation. Search is not going to go away. We can call it whatever we want, but people are used to interacting with content this way. You need to tailor the experience now to be more around "okay, how can I help you? What's your intent?" instead of the traditional search experience that has so often left customers thinking, "no, that's not the word I was looking for." So if we bring it back to Coveo, on the left-hand side here, we have the traditional search. Right?
We have all the secure connectors that let us get the content wherever it resides, and we bring those documents into our unified index, and through our machine learning models and behavior analytics we can drive relevance into our results and bring that over to the user. This is not going to cut it when we're thinking about moving over to generative experiences. What people are looking for today is really the ability to get these answers and interact with that technology. But to get that going, you're going to need this knowledge base of documents that you want to leverage, because you can't just leverage an LLM out of the box. You really need to train it on your own material, which means you're going to have to have the technology to do the extraction and the embeddings, and you're going to need a vector DB to generate these answers. But this standalone approach really brings a disjointed experience, because you have two different intent boxes, you might be leveraging duplicate content or duplicate data infrastructures, the administration is going to be separate, and you could basically end up with different sets of facts along the way. So what we believe in and what we've been working on, really in line with what our customers have been asking for, is bringing all of this together. Through our secure connectors we're able to bring in the content, and at the same time we're able to do the extraction and the embeddings, bring all of that within our unified index, and make the vector database part of it, which drives all of our semantic search capabilities, which in turn allows us to leverage our relevance models and generate answers.
That is really where we're able to ground the prompt, ensure that it's very secure, and leverage all of our security permissions. We have one set of administration. It's scalable. It's cost-effective. What we see from the generated answers is that since we're using the RAG approach, where we go and find the most relevant snippets across the entire enterprise content, we're able to generate an answer based solely on those. We're not leveraging the LLM for the information it was trained on; we're actually sending it bits of information within the prompt and saying, just generate an answer based on that. So we reduce the chances of hallucinations, which highlights my first point, the importance of your content strategy, because if you don't have the right content, the end result will not be of high quality. It won't be truthful. It won't be current. This is all very important when we're thinking about this, and it's why we think the convergence of these channels will come into play. When we think about that generative experience, this is a good example of how you can make it more interactive, bring the customer experience to another level, and bring rich formatting into the mix. In this example, let's say you're looking for tips on how to build an outdoor kitchen with a barbecue. You can actually have references within the answer from which you can start your shopping, and you can really start building your outdoor kitchen. Or think about the conversational aspect of things. This is a different example, where someone's asking the difference between a personal loan and a commercial loan. They have their answer, they also have the citations at the bottom, and then they have an opportunity to ask follow-up questions, as well as recommendations on what others have asked along the same lines.
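The RAG flow described above, retrieve the most relevant snippets, then constrain the answer to them, can be sketched as a toy example. Naive word overlap stands in here for real semantic/vector retrieval, and the actual LLM call is left out; the snippet texts and threshold are illustrative assumptions.

```python
# Toy sketch of grounding a prompt with retrieved snippets (RAG-style).
def score(query, snippet):
    """Naive relevance: fraction of query words present in the snippet."""
    q, s = set(query.lower().split()), set(snippet.lower().split())
    return len(q & s) / len(q)

def build_grounded_prompt(query, snippets, k=2):
    """Keep the top-k snippets and instruct the model to answer only from them."""
    top = sorted(snippets, key=lambda s: score(query, s), reverse=True)[:k]
    context = "\n".join(f"- {s}" for s in top)
    return (
        "Answer ONLY from the context below. If the answer is not in the "
        f"context, say you don't know.\n\nContext:\n{context}\n\nQuestion: {query}"
    )

snippets = [
    "To reconcile your bank account, open Banking and click Reconcile.",
    "Personal loans differ from commercial loans in rate and collateral.",
    "Footer: copyright 2023.",
]
prompt = build_grounded_prompt("how do I reconcile my bank account", snippets)
# The prompt carries the reconciliation snippet, not the footer noise.
```

In a production system, the scoring function would be the search engine's relevance/semantic models, and `prompt` would be sent to the LLM, which is exactly what keeps the model from answering out of its training data.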
Sometimes it will even anticipate the things you might want to ask, like "how can I apply?" or "what's the best rate for each of these?", or actually bring in the concept of next best action, with "have someone call me at this number." In this case, if I ask "how can I apply?", I'm starting to really have this conversational experience, which elevates your customer experience to another level. This is where we see the search and the chat experience, for example, converging into one, where you won't necessarily need two separate channels in this type of example. This is what we're hearing out in the field. What our customers are talking about, what's keeping them really excited about this technology, is the focus on removing a lot of friction in their customer experience, and that's pretty exciting. And this brings us to the last topic of our lessons learned, which is around the journey analytics. This is going to be a real transformation, because historically we've been focusing a lot on activity, trying to understand the activity to improve the customer experience: okay, what channels are being used, how is our AI performing, what are they actually searching for, so the different types of queries, which documents are the most popular, how am I doing on self-service success percentage versus implicit or explicit case deflection. What we're seeing right now, when we're trying to measure the impact of this technology with our customers, is that how you have to think about the analytics is very different, because if you're generating an answer and it's accurate from the get-go, you're not necessarily going to have clicks. And the answer is not generated from one document; it could be generated from three, four, or six different snippets of different documents.
So you can't measure how documents are performing. You can't measure how your AI is performing, or your self-service success, because users might not even click on anything, because they got their answer. You're faced with the challenge of measuring something that never happened, so you can't really rely on activity anymore. This is a significant challenge, because we have to think about things differently. Now you have to start thinking about engagement. How are people engaging with your content? Are they actually spending the time on the page to read your answer? Are they asking follow-up questions? You have to really start tying the journey together. This is the type of discussion we're having right now with our customers: starting to go beyond just the discussion of self-service success and deflection. And I'm not going to kid anybody, it is a challenge. It is a challenge to get people to think beyond case deflection and really start thinking about the broader customer experience. So how are you going to set up your analytics to measure that end-to-end journey, to really understand how your customers are navigating through your different digital properties, how they're engaging with your content, how they're actually using this type of technology, and what the impacts are? All these analytics should allow you to identify where the friction points are. Take an example. If you have three or four different digital properties, like a documentation site, a community, and your case submission page, you need to be able to stitch that journey together and understand: okay, they came in through the docs site, they just clicked on a few things, and they went to create a case. Okay, that's one journey. But let's say they came in and they actually engaged with your content, went through the generative experience, and did not submit a case. Well, that's another journey.
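The journey-stitching idea above can be sketched simply: order each user's events across properties by time and classify the outcome. The event schema ('user', 'ts', 'property', 'action') and the two outcome labels are invented for illustration; a real implementation would key on anonymous visitor IDs and sessions.

```python
# Sketch: stitch cross-property events into per-user journeys and classify
# each one as escalated (case created) or self-served (hypothetical schema).
def stitch_journeys(events):
    """events: list of dicts with 'user', 'ts', 'property', 'action' keys.
    Returns {user: (outcome, [ordered (property, action) pairs])}."""
    journeys = {}
    for user in {e["user"] for e in events}:
        path = sorted((e for e in events if e["user"] == user), key=lambda e: e["ts"])
        actions = [(e["property"], e["action"]) for e in path]
        escalated = any(a == "case_created" for _, a in actions)
        journeys[user] = ("escalated" if escalated else "self_served", actions)
    return journeys

events = [
    {"user": "a", "ts": 1, "property": "docs", "action": "search"},
    {"user": "a", "ts": 2, "property": "support", "action": "case_created"},
    {"user": "b", "ts": 1, "property": "docs", "action": "search"},
    {"user": "b", "ts": 2, "property": "docs", "action": "read_generated_answer"},
]
journeys = stitch_journeys(events)  # user "a" escalated; user "b" self-served
```

The two journeys in the sample data mirror the two paths described in the transcript: one user clicks around and files a case, the other reads a generated answer and never does.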
But you need to be able to stitch that. The other point is really around being able to measure that customer experience and how your customer satisfaction is going to vary. In theory, if you're removing friction from your experience, your customer effort scores will be better. But then, how does that correlate to growth and expansion? That's really the key here, because a lot of support organizations are going through that transition of saying, okay, we don't want to see ourselves as a cost center; we actually want to prove how we're contributing to the growth and success of the company. You need to be able to tie those two together. This is what we're hearing from our customers, and it is a challenge to rethink how we think about the analytics around self-service, because we've been in the case deflection mindset for a long time. I know John and I have had a few exchanges around that. But it's definitely something that is going to transform how we think about the industry, and I really think it's exciting, because now we're going to be able to position how these technologies actually add value, not just on the cost containment side of the house, but also in how they contribute to growing and expanding your own customer base. By removing friction you're adding stickiness, and your customer satisfaction will go up, and there's a clear correlation between satisfied customers and your ability to grow and expand those customers. So I'll leave you on this. Before we go to the questions, we talked a little bit about the impacts of these technologies. Our first customer to go live with our beta feature was Xero, and as they were going through their A/B testing, they actually tested, for a few weeks, Coveo without GenAI and Coveo with GenAI.
Within one week's time, they saw an improvement of twenty-one percent on implicit case deflection. The way they measured it was cases per thousand searches, and you can see here on the third line there was a significant decrease when GenAI was used. Same thing for the time on the search page, and the session duration was actually six percent higher. Of course, these results were preliminary when they initially came out; they were based on an A/B test where fifty percent of the traffic was going through one experience and the other fifty percent through the other, but they are definitely significant. If you project these numbers onto your own support centers, what would twenty-one percent more case deflection or self-service success mean to you? I can say it's quite significant. So that's what I had for you today, and we can quickly move to the questions. I did see that there are a few coming in through the chat. So, Vanessa, if you open that up. Or John, if you have any thoughts or comments on what we just shared?

Well, I think the most common thing that we're hearing, you mentioned garbage in, garbage out, and most companies have not cleaned up their content store or audited their case notes for a very long time. I do think it's interesting: one of the other GenAI features that we're seeing is the ability to create the case notes at the end of the interaction. So I'm hoping that moving forward we'll start to see much more accurate, complete case notes, because half the time I don't think they get very well documented at all. Maybe that will help moving forward. But the last time we surveyed members about knowledge management maintenance, about a third said that they hadn't cleaned up their knowledge base for a very long time. I suspect the actual number is much higher than that.
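For reference, the cases-per-thousand-searches metric behind the A/B result discussed a moment ago works out like this. The counts below are made up for illustration, not the customer's actual figures; only the shape of the calculation is the point.

```python
# Back-of-the-envelope: implicit case deflection as a drop in
# cases-per-thousand-searches between a control arm and a GenAI arm.
def cases_per_thousand(cases, searches):
    return cases / searches * 1000

def deflection_improvement(baseline_rate, genai_rate):
    """Relative drop in cases-per-thousand-searches between the two arms."""
    return (baseline_rate - genai_rate) / baseline_rate

control = cases_per_thousand(cases=50, searches=10_000)  # 5.0 per thousand
variant = cases_per_thousand(cases=39, searches=9_800)   # about 3.98 per thousand
print(round(deflection_improvement(control, variant) * 100))  # prints 20
```

With each arm getting half the traffic, as in the test described, comparing the two rates directly is what makes the relative improvement meaningful.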
So this is a really good driver for you to go do some cleanup. And if you're using a product like Coveo, you have data on what knowledge is useful and what knowledge is never being used at all. So it's not going to be a completely manual process, but I think that's certainly a great place to start.

Yeah, it's interesting that you bring up the case notes, because we've had customers who wanted to bring that information into their content to generate answers, and I was kind of hesitant to tell them to do so. If you're using case notes that are six, nine, twelve months old, you've probably resolved those things in your product, but you're leveraging them to generate answers. When we think about the quality and accuracy of content and keeping it current, you really have to think this through, because the LLM is not going to understand what is true and what is not. It's just going to generate an answer based on what you're feeding it. That's why it's important to have that layer of relevance that can identify when content is relevant and generate the right answer for your customers.

Okay. So with that, just as a reminder to everyone, if you'd like to submit a question for Patrick or John, please enter it in the ask-a-question box on the left corner of the webinar player, and we're going to get through as many questions as we can. I'm actually going to start with a question from Lynette, who asks: have you found that users automatically approach the intent box more conversationally, as opposed to keyword searching, or have companies had to train customers to search differently?

That's a good question, and definitely something that our customers are thinking about, because it is a behavior shift. Here's what we're seeing and what we're expecting.
First, and it might sound very logical, but the size of your search box is important. If it's small, people will have a tendency to write keyword searches and leave it at that. But if you put up a bigger box, and the prompt you put in it will also help, we're going to see that behavior start to shift. I think over time what we're going to see is that people are used to keyword searches, but as they start to see these generative experiences, they're going to become more accustomed to asking full questions and transitioning away from the keyword search they're used to. And I think this is not just true for your own customers; as they get accustomed to leveraging tools like ChatGPT or Bing or Bard, it's just going to become very natural to ask things in a question-based way. So I think it's going to take time, but definitely, as these experiences become more frequent, we believe, and what we're seeing is, that the behavior will shift to being more conversational in nature.

I agree completely. I've heard from Coveo and a lot of other search providers that the biggest problem they have is single-word searching, and I think the fact that people are experimenting so much with GPT and the other tools means they're just going to be more conversational in their questions. So I'm hoping that's automatically going to happen, because it's always been a handicap when you only get a single-word search.

Yep, totally.

All right, our next question is going to come from Paul, who asks: what's the difference between training your own LLM, or large language model, versus leveraging a technology like yours?

Good question. Not the first one to ask it.
But I think the key difference, if we go back to the slide where you have Coveo on the left-hand side and training your own LLM on the right-hand side, is this. Training your own LLM means that you will feed it your information on top of the information it was already trained on. So let's say you're using a version of OpenAI's model: it has already been trained on the internet, and now you're feeding it your own information. It's gonna be more challenging to ground the prompt and make sure that it's only answering based on your own content, and that it's not leveraging the information it was trained on, which could increase the chance of hallucinations. Versus, with the combined infrastructure that we showed, you're actually leveraging your search technology to identify the most relevant snippets within your documentation. Those snippets are built into the prompt, and you're asking the LLM to generate an answer solely based on that information. I'll use an example: if I go to ChatGPT and ask who founded Coveo, you know, thirty percent of the time it's gonna get it right, and it's gonna make up names and things like that, because it hasn't really been trained on that. But if I feed it the wiki page from Coveo and ask the same question, a hundred percent of the time it's gonna get it right, because you've grounded that prompt; you've given it the information it needs. And the second benefit, I would say, is that training an LLM is very costly. So it depends on your content strategy and at what rhythm you're creating new content. If you're creating new knowledge articles hundreds of times per day or per week, retraining your LLM every single time will require, first off, a lot of time and a lot of money.
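To make the grounding idea Patrick describes concrete, here is a minimal, illustrative sketch of how retrieved snippets can be assembled into a grounded prompt. This is not Coveo's actual API: the function name and prompt wording are assumptions, and the retrieval step (the relevance-ranked search returning the top snippets) is assumed to have already happened.

```python
# Illustrative sketch of prompt grounding for retrieval-augmented generation.
# The search layer (e.g. a relevance-ranked index) is assumed to have already
# returned the most relevant snippets; everything here is hypothetical naming.

def build_grounded_prompt(question: str, snippets: list[str]) -> str:
    """Assemble a prompt that asks the LLM to answer *only* from the snippets."""
    # Number the snippets so the model (and a human reader) can trace sources.
    context = "\n\n".join(f"[{i + 1}] {s}" for i, s in enumerate(snippets))
    return (
        "Answer the question using ONLY the context below. "
        "If the answer is not in the context, say you don't know.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

# Snippets would normally come from the search index, not be hard-coded.
snippets = ["Coveo provides AI-powered search and relevance."]
prompt = build_grounded_prompt("Who founded Coveo?", snippets)
# `prompt` is then sent to whichever LLM you use; because the instruction
# restricts it to the supplied context, ungrounded guesses are discouraged.
```

The design point mirrors the talk: instead of retraining the model on new content, you keep the index fresh and constrain each answer to the freshly retrieved snippets.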
So there's a lot of cost related to the training piece. Versus, if you're leveraging your search, we're already doing that: search providers are already indexing information almost in real time, if you want to, so you're always leveraging the most current, the most factual content every single time. So I think that's a key thing to take into consideration as well.

Okay, our next question here is from Daniel, who asks: do you see GenAI being used to provide consultancy services, or is it more limited to customer support?

That's a good question. From what we've been working on with our customers, it's mostly around customer support, because that's what their documentation is aimed at, right? It's been focused solely, or mostly, on self-service, to enable customers to find their answers and resolve their issues on their own. So since that's the main purpose of their documentation, they're gonna use it mostly for that. But if you have documentation that is more consultative in nature, you can definitely do that. Like, one of the things that we're looking into is troubleshooting assistance. So an agent could, yes, get an answer for the case they're working on, but if they're looking for help on troubleshooting, and you have a lot of internal documentation that covers this, you can definitely use it in that sense. The majority of our customers right now are looking at it mostly for customer support, but it could definitely be used for troubleshooting assistance or any type of consultancy, assuming you have the content that can support it.

Yeah, I can add to that from the professional services perspective.
All of the professional services automation vendors are looking at GenAI for proposal automation: being able to look at, for example, why people are buying a product or why they're engaging you for a project, and factoring their desired outcomes into creating that implementation plan or proposal. There are definitely some fabulous opportunities in automating the billing and the reporting that goes to customers on consulting projects. So, you know, I think the sky is really the limit on anything customer-facing. We're seeing it used for customer onboarding plans, even for customer marketing content for renewals and expand-selling campaigns. So really, at any stop along the customer journey, there are some great use cases for GenAI. Support is an obvious place to start because of the high transaction volume and the amount of content you already have. But, you know, hopefully we will see more of that; as we saw, fifty-four percent of companies are looking at cross-enterprise use cases, not just support.

Okay, well, thank you both so much, and we have come to the conclusion of today's live webinar. Just a couple of quick reminders before we sign off for today. There will be an exit survey at the end; please take a few minutes to provide your feedback on the content and your experience by filling out that survey, and know that a link to the recorded version of today's webinar will be sent out within the next twenty-four hours. I'd now like to take this time to thank our presenters, John and Patrick, for delivering an outstanding session. And thank you to everyone for taking the time out of your busy schedules to join us for today's live webinar, Lessons Learned: From Search to Generative Answering, brought to you by the Technology and Services Industry Association and sponsored by Coveo. We look forward to seeing you at our next TSIA webinar. Take care, everyone.
TSIA Lessons Learned | From Search to Generative Answering

Patrick Martin
Executive Vice President, Customer Experience, Coveo

John Ragsdale
VP Research, Technology Ecosystems, TSIA
