Hello, everyone, and welcome to today's webinar, F5's GenAI Journey from POC to Production and Beyond, brought to you by TSIA and sponsored by Coveo. My name is Vanessa Lazaro, and I'll be your moderator for today. I would now like to introduce our presenters: John Ragsdale, distinguished researcher and vice president of technology ecosystems for TSIA; Patrick Martin, executive vice president of global customer experience for Coveo; and Laurel Poertner, senior director of digital services for F5. As with all of our TSIA webinars, we have a lot of exciting content to cover in the next forty-five minutes, so let's jump right in and get started. John, over to you.

Well, thank you, Vanessa. Hello, everyone, and welcome to today's webinar. Something I hear from all of our TSIA members is: stop talking about the potential for AI and give us some actual use cases. How are people using it? How did the project go? What sort of ROI are they seeing? And I'm very happy that we have Laurel with us today from F5. F5 is always a pacesetter; they're always the first one to try everything and throw things at the wall to see what sticks, and I'm thrilled to have you here today to share your story. And, of course, Patrick Martin, longtime TSIA member and partner, all-around good guy, and frequent webinar participant, so I'm very happy to have you both with us today.

We're going to be talking about self-service, and I think that support organizations often get so focused on calculating the ROI of deflection and how it's going to save money that they lose track of the fact that self-service is really the experience that customers prefer. According to the channel preference study that we did, seventy-five percent of customers say that they prefer or occasionally use self-service for support. But if we look at the success rate for self-service from our benchmark, the industry average is only fifty percent, which means half the time the customer comes to your website, they're not getting the information they need. That means they're either going to leave in frustration or it's going to generate an assisted support interaction for you. So GenAI definitely has the capability of boosting that self-service success rate, and we're going to see a demo of how that works today.

Obviously, support, with its huge volume of issues and a lot of repetitive issues, has been experimenting with GenAI for customers. This survey was from last year, and I suspect that number today is quite a bit higher. But something to keep in mind is that the data on the right shows there are a lot of AI projects going on within enterprises today, and it's important that you pay attention to this case study and understand what the potential is and what the ROI is, because the stronger the business case you can make for your project, the better you're going to be at lobbying IT to prioritize it against all of the other projects they are handling. And some of those, let's be honest, exist because executives have an MBO to take something live on AI. If you've got a proven ROI model, you're probably going to be ahead of a lot of those folks.
So, Coveo is our sponsor today, and for those of you who really understand intelligent search and analytics-based search: when we survey our members on where they're going to go shopping for generative AI, eighty-one percent say they're looking to their intelligent search vendor. This makes sense, because you've spent years developing the analytics to understand what people are searching for and what content best answers those questions. If you bring in GenAI on top of that existing data structure and all of those analytics, you're going to get to success a lot faster. It seems like every week I hear of a new GenAI tool somebody built in their garage, and if you're starting from scratch and training a large language model with no previous context, it's going to take you a lot longer to get to success. So, hopefully, you're going to see some of the benefits today of working with your existing unified search vendor to introduce GenAI in a much shorter time frame.

My final slide covers the potential for knowledge management programs from GenAI. But as I said, what everybody is asking us for is how to go from concept to reality, and I'm very happy to have a real-world case study for you today. Things you need to be thinking about: introducing employee-facing versus customer-facing GenAI and how you approach that; understanding the business impact, again so you can build that business case with IT; and better understanding the security and governance issues as business users, so you can partner with IT to overcome their concerns, which I think are sometimes a little inflated. And with that, I would like to turn things over to our guest speakers today. Patrick, over to you.

Thanks so much, John. Once again, it's always a pleasure to drive one of these webinars with you, and it's even more of a pleasure to be with Laurel from F5. We've done a few things together, and I'll take the time to introduce Laurel because it's good to put her in the spotlight: she is extremely well known in the knowledge management space and definitely seen as an expert. As you can see here, she's a board member with the Consortium for Service Innovation and was recognized for those contributions with an Innovator Award. She was with Coveo when I joined six years ago, so she is an ex-colleague. So I'm definitely very happy to share this time with Laurel and walk through the experience of implementing GenAI at F5. Laurel, I'll turn it to you for a quick F5 overview, and then we'll get into the meat of the topic today and talk about your process.

Absolutely. Thank you both, and I'm so happy to be here. Let me give you a quick overview of F5. I've been here for over five years, and you can go to the next slide. F5 is committed to bringing a better digital world to life, and we've been in business for over twenty-five years. We secure and deliver nearly half the world's apps. We've got over six thousand employees; I'm based here in Seattle at corporate headquarters, but we have customers around the globe. It's been a great company to start a knowledge management journey and bring knowledge into the support world. Our support organization is over seven hundred people, again, across the globe.
If you want to go to the next slide, I'll give a quick overview of our portfolio. In securing and delivering every app, we've got three main product lines: our traditional BIG-IP for traditional apps, hardware, and packaged software; NGINX for microservices and app services; and F5 Distributed Cloud for our SaaS operations. So we have a lot of products, as you can see here, which requires a lot of knowledge. I'm happy to go to the next slide and talk a little bit about our journey. As with a lot of large enterprises, we have all sorts of different web properties and use cases, and these are the use cases where we use Coveo enterprise-wide. Our corporate website runs on Coveo; our community, which is DevCentral; MyF5, which is our ownership experience, our support portal, and also where we drive trials of our products; labs, where customers and partners can test out and use products; and then several documentation sites as well. Plus, we'll get a glimpse into the support agent experience, and I'm happy to walk you through that.

Great, thanks for that. So now let's go into a few questions that I have for you around the vendor selection process, as well as how you went through your proofs of concept and your selection. As John stated coming into the webinar, it seems like everybody has a GenAI offering, so there's a pretty huge diversity of vendors out there saying they can actually deliver on this. I would be curious to know where you started your selection process and why you went down that path.

Sure. I think there are probably a lot of you out there who can empathize with the hype cycle of AI; what it looked like a year ago is very different than now. Even six months ago, it was just this floodgate of trying to understand what's out there, what we have, and what the use cases are. That's really where we started: identifying those use cases, determining what we had at our fingertips, and the level of effort it would take to put something into production. We saw what Coveo and several other vendors had, and we put together a proof of concept. We gathered about twenty-five different subject matter experts within the services and support organization, and we really just said, help us. It was pretty fascinating to watch how this all unfolded, to the point that they were building test scripts and trying to determine how we were going to measure this and how we were going to be more objective. When we did some of that, things started to emerge. And, again, we had to build business cases, just like anything, and have ROIs. So a lot of it was, what do we have right now? We need to make some decisions in the next thirty days, for example. A lot of it was time-based, plus what we already had in front of us where we could say, we think this is the right way to go to get our foot in the door.

Well, it sounds like a lot of work, definitely, especially trying to decipher the real stuff from the noise, because there was a lot of noise if you go back six to twelve months. So going back to your first proof of concept, because I know you did several on different use cases.
Did you have a specific use case that was top priority for you? And was there a specific reason why this was the top priority use case?

Absolutely. We were very focused on efficiency. A lot of executives are asking these questions, and ours were no different in trying to understand, is it hype, is it real? The use case was our internal agent experience and how we can optimize finding content and finding answers. But we also looked at our customers as well, because it's essentially the same view and the same motions, maybe with different contexts. What we found was that we could put a search page in front of our testers across several different use cases: what does the customer-facing search page look like versus our agents'? So we were able to run our tests somewhat agnostic of that subset of use cases. That's how we did it: our internal search experience for our agents, as well as the customer-facing one.

Yeah, which seem like the two most obvious ones. But it's surprising to see how many companies, at least that we see, are going with the efficiency use case out of the gate, and they don't see the self-service piece as low-hanging fruit, for a lot of different reasons. Once you start exploring, though, it becomes a much more interesting place to look, because the impacts can be quite significant. So as you went through this, I'm sure you had tons of learnings and uncovered different things, but I'm curious: did you notice differences in the maturity of GenAI capabilities, or limitations that you'd have to live with? Because we're not kidding ourselves here; these things are still rather new. We're maybe twelve to eighteen months in, and things move very quickly, but it's still new in terms of technology and capabilities.

Yeah, we learned a ton. One of the things you touched on is that a lot of people aren't really focused on the customer-facing, self-service side, and a lot of that is risk. That's something we were looking at as well. We're a security company; we did not want to put something into production in front of our customers that would add risk to the organization. That was very important to us. Our employees are no different: we want to make sure they have access to accurate answers. So literally all of our testing was around accuracy and around risk, and trying to measure some of that so we could understand, when we're taking this big bet, what the risk is, so that we can articulate it to the executives who are investing in this. The other thing we learned is that it's really dependent on humans. When we were testing, the testing process changed so much in just the three months we spent actively asking, what is this thing doing? So even our test scripts had to evolve in a very short amount of time. I think our customers and users are having those same challenges, and we're seeing some of those behavior changes in the data daily.

Yeah.
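As an aside on the test scripts Laurel mentions: a minimal sketch of what an accuracy-and-risk grading harness for generated answers could look like is below. The question set, the rubric, and the stand-in generator are all invented for illustration; they are assumptions, not F5's or Coveo's actual test methodology.

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    question: str
    must_mention: list[str]      # facts the generated answer should contain
    must_not_mention: list[str]  # claims that would make the answer wrong or risky

def score_answer(answer: str, case: TestCase) -> float:
    """Very simple rubric: any disallowed claim zeroes the score;
    otherwise score by the share of expected facts that appear.
    A real review cycle would add human SME grading on top."""
    text = answer.lower()
    if any(bad.lower() in text for bad in case.must_not_mention):
        return 0.0
    hits = sum(1 for fact in case.must_mention if fact.lower() in text)
    return hits / len(case.must_mention)

# Hypothetical test set; real scripts would cover many more questions.
TEST_SET = [
    TestCase(
        question="How do I upgrade BIG-IP software?",
        must_mention=["back up", "boot location"],
        must_not_mention=["no backup is needed"],
    ),
]

def run_suite(generate) -> float:
    """`generate` is whatever callable produces an answer for a question."""
    scores = [score_answer(generate(tc.question), tc) for tc in TEST_SET]
    return sum(scores) / len(scores)

if __name__ == "__main__":
    # Stand-in generator so the sketch runs end to end.
    canned = lambda q: "Back up the config, install to an inactive boot location, then reboot."
    print(f"mean accuracy: {run_suite(canned):.0%}")
```

The point of a harness like this is less the exact scores than having something objective to rerun as both the answer engine and the test scripts themselves evolve.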
You bring up a very valid point: everyone needs to get used to interacting with these new capabilities. People are so used to the good old keyword search, but now that you're getting these answers, you can go beyond the keyword search and use real, live sentences to ask your question, and that will be a behavior change. But to be successful at that, and I know you're not the only ones who understand this, through the POC process I'm curious to know exactly when you realized the importance of having a robust content strategy and knowledge management program. Don't get me wrong, this is always important, but it seems like it has taken on greater importance when we're talking about generative experiences. So when did you realize it, and why is it so important?

Oh, we realized it very early on. In looking at LLMs and the technology, we started to realize that retrieval augmented generation was what Coveo was using, and the benefits of that were very compelling. Our knowledge base and the content we had really highlighted that it could cite what it was generating that response from, and that added a level of accuracy. It also added a level of confidence, and that was really what drove us to putting this in front of our customers: the fact that you could see exactly where the answer was coming from. I think that if you don't have content, and especially content that is contextual, that customers are going to understand and find because it's described in the same wording and the same context they use, it will be harder to do that. In fact, we used this on different repositories, and the more structured and the more contextual the content, the more successful it was. We definitely saw that in the data.

Yeah. That's definitely one of the conclusions that a lot of companies out there are coming to as they go through their own testing programs: the importance of content. You can't just throw generative answering or GenAI on top of your content if you don't have high-quality content. We've talked about garbage in, garbage out for years now, but it takes on an even greater meaning, or really all of its meaning, with generative, because you're generating something from a corpus of content, and if that isn't accurate, don't expect your answers to be accurate. This is not magic, right?

That's right.

Okay. I think this is probably the last question I have for you on this topic before we actually take a look at it. How did you come to the conclusion that one solution, or a set of solutions, was better for F5 than another?

So, for one, we looked at it as the decision we're going to make at this moment in time. We know the speed at which these technologies are changing; we couldn't possibly pick something knowing we were going to keep it for two years. It was: with what we have right now, what is the best thing for us? We also looked at the effort, we looked at risk, we looked at the accuracy, and we made some assessments on, do you buy it, do you build it, do you borrow it? And we looked at all of those.
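For readers who want the mechanics behind the retrieval augmented generation approach Laurel describes, here is a minimal sketch of the pattern: retrieve the most relevant knowledge content first, generate the answer only from what was retrieved, and keep the document citations attached to the response. The toy knowledge base, the term-overlap retriever, and the stubbed generation step are placeholders for illustration only, not Coveo's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Article:
    doc_id: str
    title: str
    body: str

# Toy knowledge base standing in for the real content repository.
KNOWLEDGE_BASE = [
    Article("kb-101", "Upgrading BIG-IP software",
            "Back up the configuration, download the new image, "
            "install it to an inactive boot location, then reboot."),
    Article("kb-202", "Licensing a new BIG-IP instance",
            "Activate the registration key from the license screen."),
]

def retrieve(query: str, k: int = 2) -> list[Article]:
    """Rank articles by naive term overlap; a production system would
    use a search index or vector embeddings instead."""
    terms = set(query.lower().split())
    scored = [(len(terms & set((a.title + " " + a.body).lower().split())), a)
              for a in KNOWLEDGE_BASE]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [a for score, a in scored[:k] if score > 0]

def generate_answer(query: str) -> dict:
    """Answer strictly from retrieved passages and keep the citations,
    so every response can be traced back to source documents."""
    sources = retrieve(query)
    if not sources:
        return {"answer": "No grounded answer available.", "citations": []}
    # Placeholder for the LLM call: a real system would send the query
    # plus the retrieved passages to the model here.
    summary = " ".join(a.body for a in sources)
    return {"answer": summary, "citations": [a.doc_id for a in sources]}

if __name__ == "__main__":
    result = generate_answer("How do I upgrade BIG-IP software?")
    print(result["answer"])
    print("Sources:", result["citations"])
```

The value of the pattern is exactly what Laurel points to: because the answer is assembled from retrieved documents, the citations come along for free, which is what makes the generated response auditable.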
And so, again, in terms of speed to market with low risk and accuracy, those solutions worked best for us at that time.

Good. Well, enough talking about vendor selection; let's actually take a look at what this looks like. I think we're going to see a demo of your self-service site. Am I correct?

Yes, this is our MyF5 site.

Okay, so let's take a look.

So here we have the MyF5 support portal, and we're entering a very popular search query on how to upgrade our flagship product, which is called BIG-IP. You can see that we've got a generated answer that's coming from a couple of different documents that we have. Our customers have been giving us feedback on that, and the feedback has been great. Then, if that answer doesn't give the full answer and a customer wants to spin up a live chat, we're showing that on the far right. They can ask a question in the live chat, and this is all driven by Salesforce Experience Cloud. And then here we see the agent experience, and here's where I was talking about: on the right side is Einstein GPT. What we found is that it's good for a bit simpler answers, typically a single article or a single answer, and it's easier in that a single click lets you post it. On the left, what our users have told us is that for a more complex search, and especially when you are summarizing multiple documents, Coveo's generative answering definitely comes out on top. You do have to copy the summarized answer and then paste it over onto the right side, but again, it's just an extra click, and we do see that our users are taking advantage of that, and they are giving us a lot of good feedback on how that works.

Perfect. That was pretty nice to see in action. I think you have a few other screenshots here that you want to show us on the experience per se.

Yes, if you want to bring those up. This is really the generative answer in our chat replies. We've got it set up so that agents can choose from a Coveo generative answer on the left and the Einstein recommendations on the right. What we found is that there are different use cases and they serve different purposes: I talked a little bit about multiple documents being summarized on the left, and then more of a quick answer on the right that is just a single click for an agent to post into the chat. And then, if you go to the next slide, our internal agents can get a generative answer right in the case form, based on what the case is about. They can also prompt it with manual searches at the top and get different answers, as well as the citations at the bottom. Plus, it's not on the screen, but you still have your traditional search underneath that as well.

Very, very nice. I think it's a good demonstration of how Salesforce and Coveo can be complementary and work hand in hand here.

Mhmm, absolutely.

So now that you've seen it in action, let's talk ROI, because that's a big question for a lot of people who are looking at this and saying, okay, there's a high cost to it, but what's the actual ROI?
So let's have you walk us through some of the impacts that you've seen since you started using this technology.

Sure. The one thing that is a little bit of a challenge is that we're still having conversations and learning; Coveo's customer success team has partnered with us, is learning with us, and is helping us understand the analytics. We measure our self-service success as ending a visit on our site on a resolved answer. What we saw was an eleven percent improvement on that when we incorporated generative answering. We're looking at the answer rate of the generative answer as well as traditional self-service, when someone finds a piece of content and clicks on it. We're able to combine those into our self-service success rate and see that it's climbing. That was based on our A/B test: Coveo makes it really simple to run an A/B test and bifurcate the traffic on your site, and then you just look at the dashboards and get those numbers. This was really what sealed the deal to get the investment and put it into production.

And correct me if I'm wrong, but the A/B test was Coveo against Coveo, right? Coveo with no generative answering and Coveo with generative answering, and you saw an eleven percent improvement in self-service success rate.

Correct.

Great. Well, that's impressive, and it's a good place to start. But I think you also want to share, as we talked about earlier, your test results and how you're seeing value there, or how you got to the conclusion that you would get an ROI by using this type of capability.

Yeah. So, again, this is where I was talking about our testing: we really defined this for the POC and also learned along the way. We were focused on accuracy, so it was about how much of the answer could be reused and put in front of our customers, either from an assisted support perspective or from self-service. It was really about how much time it saved employees, and also whether customers would gain value from getting an answer. It was definitely weighted much more heavily toward higher accuracy, and that played a big role in making sure we didn't have a lot of risk when we put this into production.

Makes sense. And this one, to me, is definitely impressive: how this has flipped. Do you want to talk to us about that a little bit? Because it is a challenge in a lot of support organizations: how do you flip from mostly known issues to a greater portion of new versus known? It kind of shows that self-service is working.

Right, exactly. That's another thing I was thinking about. We started this six years ago with a KCS program, knowledge-centered service, and we were really focused on content and bringing that contextual content to our customers to shift off of our assisted support to self-solve. And what we saw is that not only are we seeing the self-service success numbers and the ROI on that previous slide, but this helps us triangulate: what's coming into our assisted organization is mostly new, new issues that we have not solved before.
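To make the measurement concrete, here is a back-of-the-envelope sketch of the self-service success rate Laurel defines (a visit that ends on a resolved answer, whether from the generative answer or from a clicked piece of content) compared across an A/B split of site traffic. The visit counts are invented, and treating the eleven percent as a relative lift is an assumption made for the sake of the arithmetic; the real numbers come from the platform's analytics dashboards.

```python
def self_service_success_rate(resolved_by_answer: int,
                              resolved_by_content_click: int,
                              total_visits: int) -> float:
    """A visit counts as successful if it ends on a resolved answer,
    whether that answer came from generative answering or from a
    traditional piece of content the visitor clicked and stayed on."""
    return (resolved_by_answer + resolved_by_content_click) / total_visits

# Hypothetical A/B split of support-portal traffic.
control = self_service_success_rate(resolved_by_answer=0,
                                     resolved_by_content_click=5_000,
                                     total_visits=10_000)   # search only
variant = self_service_success_rate(resolved_by_answer=1_400,
                                     resolved_by_content_click=4_150,
                                     total_visits=10_000)   # with generative answering

lift = (variant - control) / control
print(f"control={control:.1%} variant={variant:.1%} lift={lift:.1%}")  # ~11% relative lift
```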
That is a telltale sign that customers are finding the answers: they are self-solving on the web, and then they are bringing us those new issues. Obviously, our portfolio has a lot to do with that as well. However, we wouldn't be able to expand, take on some of those new products, and grow as a business if we didn't have this program of knowledge sharing and publishing on the web in place. So it all feeds itself and supports an AI program as well. I would also argue that we were able to get capacity from our support organization to help us with F5's AI efforts and put generative answering in front of our customers, because we had some extra capacity as we were shifting these known answers to the web. So it has all helped us in this journey.

That's great. And you know me: when we start to get into ROI and data, I always want to double-click on some stuff, so I do have a few questions for you.

Totally.

That eleven percent improvement in self-service success rate is impressive out of the gate. At Coveo, we talk a lot about how we can help drive that cost reduction through self-service improvement while improving your CSAT. So as you went live with generative answering, I'm curious to know: have you seen variations in your CSAT metrics, both on self-service and assisted?

Yes. As you know, CSAT is kind of a lagging indicator, and we haven't had this in place for a real long time, so we're still definitely looking at that. CSAT, I would say, is relatively flat for assisted, slightly elevated. On self-service, what's interesting is that we are seeing a lot of feedback coming in. And I'm going to be very transparent: we're getting some very constructive feedback on generative answering that some of these summarizations aren't quite right. But what that does is give us feedback we can action and use to help tune it. Some of the things we found, and you touched on this, Patrick, were behavioral. Customers are expecting the same results out of search as they get with keyword searches, and it doesn't work that way. A lot of times they're either not getting an answer, or not quite the right answer, when they only put a couple of words in as their prompt. So what we're doing is educating: we are putting out additional content and helping our customers learn how to do this prompting. I would say the big players like Google and Meta are also doing those things for us, because I see lots of marketing around that. So it's interesting to see how this unfolds, and it'll be interesting to see in the coming months how user behavior really changes, because I think it will impact this a lot.

And what's interesting in what you're saying is that this is not something you deliver and then just check the box and say we're good. You have to be in that continuous improvement mindset, because this needs to continuously evolve based on the constructive feedback you're getting and the analytics you have on the customer experience.

Right. Yeah, absolutely.
I mean, it's just like knowledge. Knowledge is never static; it's never something where you have an answer and that's the answer forever. You're constantly improving, and, again, if you've got a strong knowledge sharing program, a knowledge management program that focuses on continuous improvement, then your content is going to serve a lot of that for your customers as well.

Yes. I really want to drill down a little bit into the evolution of your new versus known ratio over time, given the investments you've put into your knowledge program and the successes you're seeing on self-service. Can you talk a little bit about what the impact has been on your employee satisfaction and engagement? Because when we look at the data on why people leave support, or why there are high attrition rates in support, a lot of the time it's because they're not challenged enough: they're solving the same things repetitively all the time, and they don't get enough new stuff coming in. So I'm curious to know how that has shifted at F5 along with the new versus known ratio.

Yeah, this is a fun story as well, because with employee engagement, we do surveys a couple of times a year, and to me the numbers aren't as compelling as what we hear in the themes, the verbatims, and the feedback. I think it has shifted over time toward learning. Especially now that we're seeing newer products and newer issues, our employees want to be confident in these products, and they want more investment in learning so that they can be prepared to satisfy customers on the assisted side of things. So it has shifted a little bit away from being maybe more tool-based, and things like that. However, tools are also important, and I think the integrated experience that you saw before is super important: make it easy for your employees to get access to this content and these answers so they can learn as much as they can as they work issues and find the content that's there. And if it's not there, because now fifty percent of the cases coming in are ones we likely don't have a lot of content for, we need to focus on creating it. So that integrated experience makes it easier for them to say, okay, here's what I learned, maybe I collaborated with a teammate, and now I'm going to get it into our knowledge base with the click of a button. Those are some of those themes, and they're keeping us honest about needing better integrations and access to more knowledge.

Great. Well, that's really interesting. And we know that when these things happen, you're probably going to see some impact on your operational metrics, right? If more new, unknown stuff is coming in, in theory it's going to take more time to resolve, and first contact resolution is going to go down. So that's definitely something to keep in mind as you go through this. And I'm looking at the time; time flies by when you're having fun, right, Laurel? So take us home. What's next for you at F5?
I mean, looking at the journey that you've gone through, what's next? Where are you going with this?

Oh, gosh. We have, I would say, somewhere around forty different use cases that we are now looking at, doing the exact same assessment that we did for these: which ones bubble up to the top, which ones are going to give us the most efficiencies and the biggest bang for our buck. Some of the ones we're now proving out, and we've got some teams looking into, are summarizing cases and actually creating that knowledge I was talking about, or at least augmenting and helping people get a first draft so that they can get more knowledge published. Those are a couple of things in the support realm, but we are also looking at this company-wide: development, our professional services organization, customer success. Everyone is looking at which use cases they're going to ask for investment in.

Yeah, definitely lots there on the table. Lots of opportunity.

Yeah, for sure.

Great. Well, I think that's what we have for now. We've got a few minutes left for questions, right?

Yes. Thanks so much to you both, and we have a ton of questions; we're only going to be able to squeeze in a few, so I'm going to go ahead and jump right in. Steve asks, what measures did you take to ensure the security and scalability of the generative answering solution across different platforms?

Yeah, I'll touch on the security side of it. As you can imagine, being a security company, we've got all sorts of different assessments. We involved privacy and our IT organization in making sure it met all of our standards prior to putting any of this into production. In terms of scalability across the different platforms, again, we're still looking at that, and it's something we will constantly be looking at as vendors come out with new features: where does it make the most sense to invest and add? In terms of measures, I showed the testing results and our self-service success measures, but all of our use cases must have at least one success measure so we can say, how do we know this is bringing value to whatever stakeholder that use case is benefiting?

Okay, I'm actually going to squeeze in one last question here. It's from Pernita, and they ask, how do you plan to expand the use of generative AI across other digital touchpoints in your customer journey?

Well, looking at ways we can start to incorporate it into our products is one place we're looking for expansion. I know our product teams are hyper-focused on where we can incorporate this and bring it to our customers. But we are also looking, up front, at helping our customers find what they need along all touchpoints of the customer journey. Again, those use cases that I mentioned: we have forty in service and support, but we've got many more on the buy side, as well as the own side of the customer touchpoint model.
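One of the next-step use cases Laurel calls out, turning a resolved case into a first draft of a knowledge article that a person then reviews and publishes, could look roughly like the sketch below. The case fields, the prompt wording, and the call_llm stand-in are illustrative assumptions, not F5's actual workflow or any specific vendor's API.

```python
from dataclasses import dataclass

@dataclass
class SupportCase:
    subject: str
    problem_description: str
    resolution_notes: str

def call_llm(prompt: str) -> str:
    """Stand-in for a real model call (an internal gateway or a vendor
    endpoint); here it just echoes the prompt so the sketch runs."""
    return f"[DRAFT GENERATED FROM PROMPT]\n{prompt}"

def draft_knowledge_article(case: SupportCase) -> str:
    """Turn a resolved case into a KCS-style first draft that a human
    still reviews and edits before it is published."""
    prompt = (
        "Write a knowledge article draft with the sections "
        "'Issue', 'Environment', and 'Resolution'.\n"
        f"Case subject: {case.subject}\n"
        f"Problem: {case.problem_description}\n"
        f"Resolution: {case.resolution_notes}\n"
        "Only use facts stated above; do not invent steps."
    )
    return call_llm(prompt)

if __name__ == "__main__":
    case = SupportCase(
        subject="Config sync fails after upgrade",
        problem_description="Device group stopped syncing after a software upgrade.",
        resolution_notes="Re-established device trust and forced a full sync.",
    )
    print(draft_knowledge_article(case))
```

The design choice worth noting is that the model only produces a draft; the human review step is what keeps the published knowledge base accurate.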
So the best answer to that is really that we're looking everywhere.

Thank you again to John, Patrick, and Laurel for delivering an outstanding session, and thank you to everyone for taking the time out of your busy schedules to join us for today's webinar, F5's GenAI Journey from POC to Production and Beyond, brought to you by TSIA and sponsored by Coveo. We look forward to seeing you at our next webinar. Take care, everyone.
F5's GenAI Journey: From POC to Production and Beyond
An on-demand webinar

Laurel Poertner
Senior Director of Digital Services, F5

Patrick Martin
Executive Vice President, Customer Experience, Coveo
