In the fall of 2025, Walmart and OpenAI announced a partnership that was supposed to change how people shop. Customers could chat their way through a grocery list, restock essentials, or discover new products, all inside ChatGPT, no browser tab required. 

Cut to five months later and Walmart pulled the plug. OpenAI’s Instant Checkout had struggled with accuracy and was unable to match Walmart’s internal shopping tools, sending conversion rates well below what Walmart typically sees through its own channels. So Walmart did what any retailer with a homegrown AI assistant would do: it released its own ChatGPT app instead, making it possible to shop within the ChatGPT interface, on its terms.

They’re not alone. Target, Instacart, Shopify, and Etsy have all made the same call. The lesson isn’t that conversational commerce doesn’t work, but that handing the keys to a third party doesn’t either. Shoppers want the convenience of ChatGPT, and brands want ownership of what those shoppers actually see: accurate inventory, correct pricing, and a brand experience they control. A ChatGPT app is how you can have both.

Have the ChatGPT Cake and Own It Too

Shoppers are already using AI assistants to research products, compare options, and ask for recommendations. That behavior isn’t waiting for brands to catch up. The problem is that without a direct catalog connection, brands have no say in how they appear in those conversations. ChatGPT scrapes what it can find. Prices may be stale, products may be discontinued, and your brand voice is nowhere in the picture. That’s a control problem, not a visibility problem.

A ChatGPT app solves it. Instead of hoping a third-party integration represents your catalog accurately, you plug your data in directly. You decide what’s surfaced, how it’s presented, and where the shopper lands. You get the reach of one of the fastest-growing consumer interfaces in the world without surrendering ownership of the experience.

The good news is that the technical lift to build an app is smaller than it appears, especially if you’re already running Coveo. Here’s how to build it.

How to Build It: The Technical Foundation

The key enabler here is the Model Context Protocol (MCP) — an open standard that lets AI clients like ChatGPT talk to external data sources. Think of it like a USB standard for AI: a common protocol that any AI client can use to connect to any data source, without custom one-off integrations for each pairing.

In practice, you build a small server (an MCP server) that sits between ChatGPT and Coveo. When a shopper asks a product question in ChatGPT, the request hits your MCP server, which translates it into a structured Coveo Commerce API query and returns normalized product results. ChatGPT uses those results to compose a conversational answer.
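To make the translation concrete, here’s a minimal sketch of the two functions at the heart of that bridge: one that builds a Coveo-style request body from the tool call’s arguments, and one that normalizes raw results into the compact product list ChatGPT consumes. The field names (ec_price, ec_name, ec_images, clickUri) follow Coveo’s common commerce field conventions, but the exact request contract is an assumption — verify it against your organization’s Commerce API reference.

```typescript
// Illustrative facet shape ChatGPT passes as a structured argument.
interface Facet {
  type: "range" | "value";
  field: string;          // e.g. "ec_price" or "ec_brand" (assumed field names)
  value?: string;
  min?: number;
  max?: number;
}

interface ToolCallArgs {
  query: string;
  facets?: Facet[];
}

// Raw product fields as they might come back from the Coveo index (assumed).
interface CoveoProduct {
  ec_name: string;
  ec_price: number;
  ec_images?: string[];
  clickUri: string;
}

// Build the JSON body the MCP server would POST to the Commerce API.
function buildCommerceRequest(args: ToolCallArgs) {
  return {
    query: args.query,
    facets: args.facets ?? [],
    perPage: 10, // keep result sets small; ChatGPT summarizes them anyway
  };
}

// Normalize raw results into the compact shape returned to ChatGPT.
function normalizeResults(products: CoveoProduct[]) {
  return products.map((p) => ({
    name: p.ec_name,
    price: p.ec_price,
    image: p.ec_images?.[0] ?? null,
    url: p.clickUri,
  }));
}
```

Keeping both directions of the translation in pure functions like these makes the server trivial to unit-test before you ever wire it to ChatGPT.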

If you’re already using Coveo, the heavy lifting, including ranking, merchandising, and faceted filtering, is already handled. You’re wiring together what you already have.

The Playbook

What you’ll need before you start

Before writing any code, make sure you have:

  • A Coveo organization with your product catalog indexed
  • A Coveo API key with query permissions on your sources
  • A ChatGPT Plus, Teams, or Enterprise subscription
  • Node.js 18+ and basic TypeScript familiarity

Step 1: Build the MCP server

The MCP server acts as a bridge between ChatGPT and Coveo. It translates natural-language queries into structured Coveo API calls and returns normalized results.

You expose a single tool, commerce_search_coveo, that ChatGPT knows to call whenever a shopper asks a product question. The server runs in stateless mode, meaning each request is fully independent. That’s an important design decision: stateless mode eliminates session-management complexity and makes horizontal scaling straightforward, with no sticky sessions and no shared state to coordinate.
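A tool advertisement is just a name, a description, and a JSON Schema for its inputs. Here’s a sketch of what the commerce_search_coveo descriptor might look like; the schema is what teaches ChatGPT to pass constraints as structured facets. The exact property names are illustrative, not a fixed contract.

```typescript
// Hypothetical tool descriptor the MCP server advertises to ChatGPT.
// The JSON Schema below is the model's "instruction manual" for the tool.
const commerceSearchTool = {
  name: "commerce_search_coveo",
  description:
    "Search the product catalog. Pass price or brand constraints as " +
    "structured facets instead of embedding them in the query string.",
  inputSchema: {
    type: "object",
    properties: {
      query: { type: "string", description: "The shopper's search terms" },
      facets: {
        type: "array",
        description: "Structured filters extracted from the request",
        items: {
          type: "object",
          properties: {
            type: { type: "string", enum: ["range", "value"] },
            field: { type: "string" },   // e.g. "ec_price", "ec_brand"
            value: { type: "string" },
            min: { type: "number" },
            max: { type: "number" },
          },
          required: ["type", "field"],
        },
      },
    },
    required: ["query"],
  },
} as const;
```

Because the descriptor is plain data, you can validate it in tests and evolve it without touching the request-handling code.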

The server also includes a tool description that teaches ChatGPT when and how to call it, including how to pass structured filters like price ranges and brand preferences as facets, rather than embedding them awkwardly in the query string.

Because Coveo indexes more than just product data, this approach extends beyond commerce: you can define tools for Coveo’s Search and Passage Retrieval APIs as well, enabling experiences that go from product discovery to support and self-service. This is a key differentiator for Coveo, allowing a single integration to power the entire end-to-end customer experience.

Step 2: Handle real shopper language

Here’s the core problem this step solves: shoppers don’t search the way search boxes expect them to. They say things like “I need a tent for two people, nothing over $300.” A traditional keyword search would choke on that. Your MCP server needs to extract the structured intent (price ceiling, category, quantity) and pass it as proper Coveo facets.

The approach is two-layered. The tool’s input schema tells ChatGPT exactly what parameters to pass, so the LLM does the heavy lifting on complex constraints and passes them as structured facet objects (e.g., { type: "range", field: "ec_price", max: 300 }). On the server side, a regex-based budget extractor runs as a fallback, catching price mentions that the LLM might pass through in the raw query string rather than as a structured filter.
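The fallback layer can be small. Here’s a sketch of a budget extractor that recognizes a few common English price-ceiling phrasings; it’s an illustration, and you’d extend the patterns for your locale and currency formats.

```typescript
// Fallback budget extractor: catches price ceilings that slipped into the
// raw query string instead of arriving as a structured facet.
// Matches phrasings like "under $300", "nothing over 300", "max of $50".
function extractBudget(query: string): number | null {
  const match = query.match(
    /\b(?:under|below|less than|nothing over|max(?:imum)?(?: of)?)\s*\$?\s*(\d+(?:\.\d+)?)/i
  );
  return match ? parseFloat(match[1]) : null;
}
```

If the extractor fires, the server appends the equivalent ec_price range facet to the Coveo query before sending it, so the constraint is honored no matter which layer caught it.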

The result: a shopper’s natural language reliably maps to precise Coveo Commerce API queries, with the right products surfaced regardless of how the constraint was expressed.

Step 3: Add a product card UI

By default, ChatGPT presents tool results as text. Step 3 changes that, and it’s where the experience goes from functional to genuinely compelling.

You register an HTML resource alongside your tool that ChatGPT renders in an iframe next to the conversation. That’s where your product cards live: images, prices, brand names, and direct links to product detail pages.

The UI is a self-contained React app compiled to a single HTML file, served as an MCP resource with a specific MIME type that tells ChatGPT to render it as an interactive app. Because this UI is fully defined by the business, it can be customized to match brand guidelines, merchandising strategies, and UX patterns, giving teams complete control over the shopping experience rather than relying on a generic interface.
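In descriptor form, the resource registration might look like the sketch below. The "text/html+skybridge" MIME type is what OpenAI’s Apps SDK has documented for interactive widgets, but treat both it and the ui:// URI scheme as assumptions to confirm against the current SDK docs.

```typescript
// Hypothetical resource entry pointing ChatGPT at the product-card widget.
// The `text` field holds the compiled, single-file React bundle.
const productCardResource = {
  uri: "ui://widgets/product-cards.html",        // assumed URI scheme
  name: "Product card widget",
  mimeType: "text/html+skybridge",               // assumed Apps SDK MIME type
  text: "<!doctype html><html><body><div id=\"root\"></div></body></html>",
};
```

The tool result then references this resource so ChatGPT knows which widget to render alongside the conversation.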

This step is technically optional. ChatGPT will still surface results as text without it. But if you have a live catalog with product images, skipping the UI leaves significant experience quality on the table. This is what makes the channel feel like shopping, not just querying.

Step 4: Connect to ChatGPT

Once your server is running and publicly accessible over HTTPS, you register it in ChatGPT under Settings > Apps & Connectors. ChatGPT discovers your tools automatically from the server with no additional configuration needed.

From there, shoppers add the App to any conversation and start querying your catalog directly. You can also package this as a custom GPT with a brand name, icon, and persistent system instructions for a more polished, branded experience that can be shared via the GPT Store. Either way, the MCP server connection works identically.

Step 5: Test the full loop

Run through a set of realistic shopper queries that cover the full range of what your MCP server needs to handle:

  • A broad category search (“Show me women’s running shoes”)
  • A budget-constrained request (“Hiking boots under $200”)
  • A brand-specific question (“What do you have from Patagonia?”)
  • A follow-up refinement (“Show me just the three cheapest”)

For each, verify that products are returned from your actual catalog with correct pricing and images, and that clicking through lands on the right product detail page.
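It helps to codify those checks. Here’s a sketch of a validator you can run over every product card your tool returns during the test loop; the field names match a normalized card (not Coveo’s raw schema) and are assumptions from this walkthrough.

```typescript
// Minimal sanity checks for a normalized product card returned to ChatGPT:
// a non-empty name, a positive price, and an absolute product-detail URL.
interface ProductCard {
  name: string;
  price: number;
  url: string;
}

function validateCard(card: ProductCard): string[] {
  const problems: string[] = [];
  if (!card.name.trim()) problems.push("missing name");
  if (!(card.price > 0)) problems.push("non-positive price");
  if (!/^https?:\/\//.test(card.url)) problems.push("url is not absolute");
  return problems;
}
```

Running every result from the four query types through a validator like this catches stale-catalog and broken-link issues before shoppers do.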

Start Here, Get Ahead

The infrastructure for conversational commerce is no longer experimental. Major retailers have moved, the protocol is standardized, and ChatGPT’s App ecosystem is growing. The brands that build a direct catalog connection now will have a lead that’s genuinely difficult to close later, because relevance compounds: every query your catalog answers well is data that improves future ranking.

The technical lift, as you’ve seen, is smaller than it looks, particularly if Coveo is already powering your search. What you’re really building is a translation layer: from natural language to structured commerce queries, and back to a shopping experience that feels native to the channel.

For the full implementation, including code samples, schema definitions, and configuration details, see the step-by-step guide in the Coveo documentation.

Dig Deeper
Read the full implementation guide.