For the past few months we have been showing customers how Hello Customer connects to Claude. Two pieces make it work. An MCP server gives the model live access to the CX data — scores, key drivers, sentiment, sampled feedback. A set of Skills sits on top: small bundles of instructions and scripts that the model picks up automatically when a question calls for them. The combination turns "prepare the QBR for this client" or "give me the retention story for last quarter" into a working request rather than a project.
After every conversation where we demonstrate this, one question comes back: can we do the same thing with Microsoft Copilot, Gemini, or ChatGPT?
The honest answer has two parts, and mixing them up leads to promises that don't hold up. To answer the question well, you have to separate two layers: the data integration through MCP, and the Skills that sit on top.
MCP — Model Context Protocol — is not Claude technology in any narrow sense. Anthropic published it as an open protocol in November 2024, and the major players have adopted it since. OpenAI announced MCP support for ChatGPT and the Agents SDK in March 2025. Google did the same for Gemini. Microsoft built it into Copilot Studio and keeps extending the rest of the Copilot family.
What that means in practice: our Hello Customer MCP server can, in principle, be addressed by any of these platforms. The connection between the model and the CX data is not tied to a single vendor. That is the point of an open protocol. Today the integration runs stably in production with Claude; with the others, it depends on how far each platform's MCP implementation has come.
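Under the hood, MCP messages are plain JSON-RPC 2.0, which is why the same server can serve different clients: whichever platform is on the other end, the request it sends to invoke a tool has the same shape. A minimal sketch of that message; the tool name and arguments here are hypothetical, for illustration, and are not Hello Customer's actual API:

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> dict:
    """Build the JSON-RPC 2.0 message an MCP client sends to call a tool."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",  # standard MCP method for tool invocation
        "params": {"name": tool_name, "arguments": arguments},
    }

# A hypothetical request for a quarter's worth of NPS data.
request = make_tool_call(1, "get_nps_summary", {"client": "acme", "quarter": "2025-Q3"})
print(json.dumps(request, indent=2))
```

Any MCP-capable client, regardless of vendor, produces a request of this form, which is what makes the integration model-independent in principle.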
Skills are a different story. Skills as we use them are a Claude concept: a folder containing a markdown instruction file, optional scripts, and a short description that lets the model decide when the Skill is relevant. Three properties matter. The model activates the Skill automatically based on the question. The content loads only when needed. And executable code can sit inside.
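To make that concrete, here is a sketch of what such a Skill's instruction file could look like, following the published Agent Skills layout in which a SKILL.md carries a name and description in YAML frontmatter. The Skill name, description, and steps below are illustrative, not our actual QBR Skill:

```markdown
---
name: qbr-preparation
description: Prepare a quarterly business review from Hello Customer CX data.
  Use when the user asks for a QBR, account review, or retention story.
---

# QBR preparation

1. Pull scores, key drivers, sentiment, and sampled feedback for the client
   via the MCP server.
2. Summarise the trends against the previous quarter.
3. Assemble the findings into the review structure described below.
```

The frontmatter description is what the model reads to decide whether the Skill applies to the question at hand; the body loads only once it does.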
Every other platform has its own version of the idea. None map one-to-one, but the underlying material — instructions, workflows, domain logic — travels. It takes work, not a rebuild from scratch.
Two things follow from all of this.
The data integration with Hello Customer is independent of the model. Through MCP we are ready to work with Claude, ChatGPT, Gemini, and Copilot. The production-grade integration today sits with Claude, and the rest will follow as their MCP implementations mature.
The Skills we have built are tuned for Claude. Making the underlying logic portable to other platforms is a separate track, and one we are actively investigating — what carries cleanly across, what needs to be reshaped per platform, and where automatic activation can be approximated when it isn't native.
Claude is the most mature environment for this style of work right now. That is why we started there, and why our first wave of experiences lives there. The other platforms are catching up. When the moment comes to move, the substance — the workflows, the domain knowledge, the integration through MCP — travels with us.