Enterprise AI announcements have a familiar arc. The launch is loud. The demo is impressive. And then, somewhere between proof of concept and production reality, things get quiet. Not a dramatic failure — just a slow retreat from something that worked beautifully until it had to work every day. The gap between what AI appears capable of and what it actually delivers under real conditions has a name: the Wow-Trust Gap.

Closing that gap isn’t about finding a better model. It’s about building the right foundation underneath it — connecting systems, governing knowledge, tuning retrieval, and thinking hard about exactly where in a workflow an AI response would actually change what someone does next. That work is less exciting than the launch announcement. It’s also the work that determines whether any of it sticks.

This year’s Innovator Award winners did that work. A workforce platform, a data intelligence company, and an enterprise software leader — each building AI into the specific moment when someone needed it most, grounded in knowledge that made the response worth acting on.

UKG: Relevance Before Generation, at the Speed of Voice

Voice is an unforgiving medium for AI. There’s no loading spinner to hide behind, no suggested search to fall back on, no moment where the user can pause and reframe the question. When a customer calls with a complex HR or payroll issue, the response has to be fast, accurate, and conversational — or the experience falls apart.

UKG designed its Intelligent Virtual Assistant with a clear understanding that delivering great customer experiences requires more than generative AI alone. Their solution pairs Sierra.ai’s conversational voice platform with Coveo’s Passage Retrieval API, so that every spoken question is grounded in the most relevant passages across more than 60,000 pieces of UKG documentation before any response is generated. This “relevance first, generation second” approach helps deliver accurate, timely answers — which matters enormously when customers are seeking immediate support. 
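In rough sketch form, the "relevance first, generation second" pattern looks something like this. This is a minimal illustration with stubbed-out retrieval and generation; the function names and payloads are assumptions for clarity, not the actual Coveo Passage Retrieval or Sierra APIs.

```python
from typing import List

def retrieve_passages(question: str, max_passages: int = 5) -> List[str]:
    """Stub for a passage-retrieval call that returns the most relevant
    documentation snippets for the question (an HTTP call in production)."""
    return [
        "Overtime is calculated on a weekly basis.",
        "Payroll adjustments close at 5pm ET.",
    ][:max_passages]

def build_grounded_prompt(question: str, passages: List[str]) -> str:
    """Constrain generation to the retrieved passages, so the model answers
    from documentation rather than from its own priors."""
    context = "\n---\n".join(passages)
    return (
        "Answer using ONLY the passages below. If they do not cover the "
        f"question, say so.\n\n{context}\n\nQuestion: {question}"
    )

def generate(prompt: str) -> str:
    """Stub for the voice platform's LLM call."""
    return "(answer grounded in the retrieved passages)"

def answer(question: str) -> str:
    # Relevance first: retrieve before any text is generated.
    passages = retrieve_passages(question)
    # Generation second: the model only ever sees grounded context.
    return generate(build_grounded_prompt(question, passages))
```

The ordering is the design choice that matters: retrieval quality gates generation, so a poorly covered question is far more likely to become an honest "I don't know" than a confident hallucination.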

The results reflect real production scale: thousands of customer interactions handled daily through voice channels, with many callers getting what they need without ever involving a live agent. When escalation is needed, full context is preserved in Salesforce so live agents can pick up without missing a beat. It’s AI that meets customers where they are — on the phone — and delivers the fast, reliable support they expect.

Relevant reading: The Wow-Trust Gap: Why AI Breaks in Production

Alation: Meeting Users in the Moment of Need

Alation’s support team noticed something worth paying attention to: declining help center traffic. Not because customers needed less help, but because, as seen across the digital landscape, customers are increasingly comfortable asking their AI tool of choice for help. The old model of “go to the portal, search, find an answer” was losing ground to something faster and more immediate.

Instead of resisting change, Alation’s team followed their users. They expanded Coveo across every surface where a customer or employee might need an answer — the help center, the community site, the corporate site, and directly inside the Alation product via an in-product experience and a new chatbot called Alamigo, built on Coveo’s Passage Retrieval API. Alamigo goes one step further: when an error message surfaces while a system admin is working, it’s automatically sent to Alamigo as a query. The user doesn’t have to ask. The help arrives.
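The error-to-query handoff can be sketched in a few lines. Everything here, including the error code and response shape, is a hypothetical illustration of the pattern, not Alamigo’s actual implementation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AssistantResponse:
    query: str
    passages: List[str] = field(default_factory=list)

def retrieve_help(query: str) -> AssistantResponse:
    """Stub for a passage-retrieval-backed chatbot lookup."""
    return AssistantResponse(query=query, passages=[f"Guidance for: {query}"])

def on_error_surfaced(error_code: str, error_message: str) -> AssistantResponse:
    """When the product surfaces an error, forward it to the assistant as a
    query automatically; the admin never has to type a question."""
    return retrieve_help(f"{error_code}: {error_message}")
```

Treating the error itself as the query removes the step where users have to translate a problem into a search, which is exactly where many self-service experiences lose people.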

For employees, the team connected Coveo’s MCP Server to ChatGPT — letting internal teams ask natural language questions and get answers drawn from Alation’s own knowledge, with security and governance intact. The results are measurable: year over year, more customers are finding answers on their own, creating fewer cases, and resolving issues without ever needing to reach support. The majority of successful customer engagements now happen entirely outside of cases — a direct reflection of how many more ways there are to get help.

BMC: 10 Years of Knowledge, Finally Accessible

BMC had something most companies would envy: more than a decade of institutional knowledge in the form of product documentation, knowledge articles, compatibility matrices, internal wikis, and engineering insights. The problem was that getting to the right piece of it still took too long. Engineers manually cross-referenced documents. Services teams hunted across siloed systems. The knowledge existed; the path to it was the bottleneck.

As an early closed-beta adopter of Coveo’s MCP Server, BMC embedded Coveo’s relevance engine directly into Microsoft Copilot Studio, creating specialized agents for their Customer Success, Professional Services, and R&D teams. The highest-impact use case is log analysis: engineers now upload error logs directly into Copilot, which parses them, queries Coveo-indexed knowledge articles, and returns targeted troubleshooting guidance in seconds. What previously required manually opening and cross-referencing multiple documents now happens in a single interaction.
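A simplified version of that log-analysis loop might look like the sketch below. The regex, error codes, and knowledge lookup are illustrative assumptions, not BMC’s actual agent logic.

```python
import re
from typing import Dict, List

# Hypothetical pattern for error lines; real log formats vary widely.
ERROR_PATTERN = re.compile(r"(ERROR|FATAL)\s+([A-Z]+-\d+)?\s*(.*)")

def extract_error_signatures(log_text: str) -> List[str]:
    """Pull distinct error lines out of an uploaded log so each one can
    be used as a retrieval query."""
    signatures: List[str] = []
    for line in log_text.splitlines():
        match = ERROR_PATTERN.search(line)
        if match:
            signature = " ".join(filter(None, match.groups()))
            if signature not in signatures:
                signatures.append(signature)
    return signatures

def search_knowledge(signature: str) -> List[str]:
    """Stub for a query against indexed knowledge articles."""
    return [f"KB article matching '{signature}'"]

def analyze_log(log_text: str) -> Dict[str, List[str]]:
    """Map each distinct error signature to targeted guidance."""
    return {sig: search_knowledge(sig) for sig in extract_error_signatures(log_text)}
```

The point of the deduplication step is that a log may repeat the same failure hundreds of times; the agent only needs to query the knowledge base once per distinct signature.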

The solution also tackles a subtler problem: version accuracy. BMC worked with Coveo to implement metadata-driven version boosting so that generic queries default to the latest supported product version — preventing outdated documentation from surfacing during live customer engagements. The team moved from proof of concept to daily production usage within weeks, with strong satisfaction reported across the teams using it daily. The knowledge was always there. Now it shows up exactly when it’s needed.
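The idea behind metadata-driven version boosting can be shown generically. This sketch assumes each indexed document carries a version metadata field; the version numbers and boost weight are made up, and this is not Coveo’s actual ranking syntax.

```python
from dataclasses import dataclass
from typing import List

KNOWN_VERSIONS = {"21.02", "23.3"}  # hypothetical product versions
LATEST_SUPPORTED = "23.3"
VERSION_BOOST = 100.0

@dataclass
class Doc:
    title: str
    version: str
    base_score: float  # relevance score before version handling

def score(doc: Doc, query: str) -> float:
    """If the query names a version, respect it; for generic queries,
    boost documents tagged with the latest supported version so stale
    docs don't surface first."""
    query_is_versioned = any(v in query for v in KNOWN_VERSIONS)
    if not query_is_versioned and doc.version == LATEST_SUPPORTED:
        return doc.base_score + VERSION_BOOST
    return doc.base_score

def rank(docs: List[Doc], query: str) -> List[Doc]:
    return sorted(docs, key=lambda d: score(d, query), reverse=True)
```

The conditional is what keeps the boost safe: a customer who explicitly asks about an older release still gets that release’s documentation, while everyone else defaults to current material.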

Relevant reading: 5 Best Practices for Multichannel Knowledge Management

Hear Their Stories at Relevance 360

Want to learn more about this year’s Relevance Awards winners and the work behind these implementations? Explore the full stories at R360.

Relevant viewing: Trust is Currency in the Agentic Era