Most internal data catalogs die the same way: someone builds a careful dictionary of dashboards and definitions, it goes stale, and people go back to pinging the team on Slack anyway.

The Coveo Product Intelligence team had a version of this problem. They maintained a rich set of analytics assets — dashboards, reports, semantic definitions — but discoverability was still a bottleneck. Users didn’t know what existed or where to look, so they asked a person instead.

When the Coveo MCP Server launched, it offered a different approach: connect a conversational interface directly to your indexed content, with minimal infrastructure. We used it to build a lightweight discovery tool on top of our existing assets — no complex agentic layer, no custom LLM fine-tuning, just a push source, a query pipeline, and a well-crafted prompt.

Here’s how we built it, and what we learned.

Building Product Intelligence Discovery

Our initial experiments quickly showed promising results.

We started with a simple setup: a custom script that extracted well-contextualized data from our reporting tool’s API, applied light transformations, and pushed the result to a Coveo Push source. A critical step was shaping each item’s body to carry enough semantic context for retrieval.

coveo_formatted_data = [
    {
        "objecttype": "Card",
        "documentId": f"ANALYTICS_TOOL_URL/question/{row.get('id','')}",
        "title": ETLFormatter.format_column(row, "name"),
        "collection": ETLFormatter.format_column(row, "collection_name"),
        "description": ETLFormatter.format_column(row, "description"),
        "display": ETLFormatter.format_column(row, "display"),
        "source_table": ETLFormatter.format_column(row, "table_name"),
        "distinct_user_count_last_6_months": ETLFormatter.format_column(
            row, "distinct_user_count_last_6_months"
        ),
        "view_count_last_6_months": ETLFormatter.format_column(
            row, "view_count_last_6_months"
        ),
        "language": "English",
        "compressedBinaryData": ETLFormatter.encode_body(
            f"""
            Card: {ETLFormatter.format_column(row, 'name')}
            Collection: {ETLFormatter.format_column(row, 'collection_name')}
            Description: {ETLFormatter.format_column(row, 'description')}
            Display: {ETLFormatter.format_column(row, 'display')}
            Source Table: {ETLFormatter.format_column(row, 'table_name')}
            Distinct Users (Last 6 Months): {ETLFormatter.format_column(row,'distinct_user_count_last_6_months')}
            View Count (Last 6 Months): {ETLFormatter.format_column(row, 'view_count_last_6_months')}
        """
        ),
        "compressionType": "gzip"
    }
]
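The `ETLFormatter` helpers above are our own ingestion utilities, not part of any Coveo SDK. The key one, `encode_body`, simply gzip-compresses the body text and base64-encodes it, which is the shape the Push API expects for `compressedBinaryData` when `compressionType` is `gzip`. A minimal sketch:

```python
import base64
import gzip


class ETLFormatter:
    """Small helpers used by the ingestion script (our own code, not a Coveo SDK)."""

    @staticmethod
    def format_column(row: dict, column: str) -> str:
        # Coerce missing or None values to an empty string so the body stays clean.
        value = row.get(column)
        return "" if value is None else str(value)

    @staticmethod
    def encode_body(body: str) -> str:
        # The Push API expects compressedBinaryData to be base64-encoded,
        # gzip-compressed bytes when compressionType is "gzip".
        return base64.b64encode(gzip.compress(body.encode("utf-8"))).decode("ascii")
```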

On top of this, we used a query pipeline combined with a semantic encoder model to surface highly relevant results. This part of the architecture was intentionally simple, with effort focused on curating high-quality content in the ingestion step.
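Because the pipeline and encoder do the heavy lifting server-side, the client only needs to send a query and name the pipeline. A hedged sketch of what such a call against the Coveo Search API might look like (the pipeline name, organization ID, and field names here are placeholders):

```python
import json
import urllib.request

SEARCH_URL = "https://platform.cloud.coveo.com/rest/search/v2"


def build_search_payload(query: str, pipeline: str = "pi-discovery") -> dict:
    # The pipeline name is a placeholder; the real one is configured in the
    # Coveo admin console alongside the semantic encoder model.
    return {
        "q": query,
        "pipeline": pipeline,
        "numberOfResults": 5,
        "fieldsToInclude": ["title", "collection", "description", "source_table"],
    }


def search_assets(query: str, api_key: str, org_id: str) -> list[dict]:
    request = urllib.request.Request(
        f"{SEARCH_URL}?organizationId={org_id}",
        data=json.dumps(build_search_payload(query)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=30) as response:
        return json.load(response).get("results", [])
```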

From there, we connected the Coveo MCP Server to this pipeline through a custom GPT in ChatGPT Enterprise. This made it easy to distribute the experience across the organization, allowing users to install and use it with minimal friction.

Finally, a well-crafted prompt tied everything together. We drafted the prompt directly within ChatGPT, and together with the curated ingestion step it delivered quality responses. No complex agentic layer was required.
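For illustration, the custom GPT's instructions might look something like this (a paraphrased sketch, not our production prompt):

```text
You are the Product Intelligence Discovery assistant.
When a user asks where to find data, metrics, or dashboards:
1. Rephrase their question as a short search query.
2. Call the Coveo search tool with that query.
3. Answer only from the returned results, linking each asset by title.
4. If nothing relevant is returned, say so and suggest refining the question.
```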

Users interacted naturally with ChatGPT, which in turn issued queries to Coveo, grounding responses in our indexed content and consistently returning relevant results.

Enhanced User Experience

With the Product Intelligence Discovery tool, the user experience for data exploration has been transformed:

  • Natural Questions: Users can ask questions in plain language, like “Where can I find the weekly consumption metrics?” or “Do we have data about push source adoption?”
  • Quick Discovery: They quickly discover relevant assets and links without navigating multiple tools or asking a colleague.
  • Shareable Conversations: If they still need to reach out for deeper context, they can share their conversation and the discovered assets, ensuring the conversation starts with all the necessary context.

A Foundation for Scale

The Product Intelligence Discovery application shows how far a simple experiment can go when built on the right foundation. With a lightweight setup and minimal moving parts, the Coveo MCP Server makes it easy to connect high-quality content to a conversational interface and start delivering value quickly.

If you’re exploring where to start with conversational AI, this is a practical place to begin. Focus on curating the right content, keep the architecture simple, and iterate on the experience. With the right foundation in place, you can grow from a small, targeted use case into something that supports your organization more broadly.

Learn more about the possibilities with Coveo on our docs site.

Relevant reading
  • Coveo documentation