How Anthropic’s MCP Outshined OpenAI’s Function Calling
The Power of Product Design and Distribution in AI
The AI world continues to show that even groundbreaking research depends heavily on strong product design and effective distribution.
OpenAI launched function calling in June 2023, more than a year and a half ago. It was clear to me it was going to transform the way we interact with software: the Adaptive Computing I described back in 2020 when GPT-3 launched. Yet beyond the usual few demos, it didn't catch fire.
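For context, this is roughly what function calling looks like with the OpenAI Python SDK. A minimal sketch, not a real product: the `get_weather` tool and its schema are illustrative, and the original June 2023 release used a `functions` parameter where the current SDK says `tools`.

```python
from openai import OpenAI

client = OpenAI()

# Describe a function the model is allowed to "call".
# Name and schema here are illustrative, not from a real app.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What's the weather in Turin?"}],
    tools=tools,
)

# The model replies with a structured tool call instead of prose;
# executing it and feeding the result back is my integration work.
tool_call = response.choices[0].message.tool_calls[0]
print(tool_call.function.name, tool_call.function.arguments)
```

Everything around that call, the tool implementations, the schemas, the result handling, is bespoke per application, which is exactly the friction MCP removes.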
Anthropic's MCP (Model Context Protocol) is a streamlined, modular, reusable take on function calling, and from a user-experience perspective it achieves exactly the same result. In other words, you could have built exactly the same product back in 2023, like a natural-language app to turn lights on and off, using function calling alone.
Product Design Importance
But MCP's design wins. Function calling is basically integration work to let my LLM talk to an API. It takes time, it has to be learned, and it's custom-made for each application.
MCP's client-server architecture instead lets any application reuse the same function-calling capability (people started with Cursor simply because it's one of the few clients that can interface with MCP servers).
Hundreds of MCP servers are already available that I can use directly. As a developer, I simply connect my app to an existing MCP server, pass in the user query in natural language, and boom! I've integrated with a 3rd-party app that extends my own. It's like what Zapier did for APIs, but simpler and far more adaptive, thanks to the LLM driving it.
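As a rough sketch of that flow with the official MCP Python SDK (the filesystem server launched here is just one published example; any MCP server works the same way):

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch an existing, off-the-shelf MCP server as a subprocess.
# Swap in any published server; none of this code changes.
server = StdioServerParameters(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the tools the server exposes: no docs to read,
            # the protocol describes them for me.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

asyncio.run(main())
```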
The real breakthrough is this: I don’t need to master third-party documentation or piece together API calls—MCP turns English user queries into seamless integrations the moment my client connects to an MCP server.
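To make that concrete, here is a hedged sketch of the whole loop, assuming the `session` from the previous snippet and an OpenAI-style `llm` client; it also assumes the model actually picks a tool and the server returns text, and the schema translation is simplified:

```python
import json

async def answer(session, llm, user_query: str) -> str:
    # Advertise the MCP server's tools to the model, translated
    # into the tool format the LLM API expects (simplified).
    mcp_tools = await session.list_tools()
    tools = [{
        "type": "function",
        "function": {
            "name": t.name,
            "description": t.description or "",
            "parameters": t.inputSchema,
        },
    } for t in mcp_tools.tools]

    response = llm.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": user_query}],
        tools=tools,
    )

    # Route the model's chosen tool call to the MCP server.
    # (A real client would handle the no-tool-call case too.)
    call = response.choices[0].message.tool_calls[0]
    result = await session.call_tool(
        call.function.name, json.loads(call.function.arguments)
    )
    return result.content[0].text
```

The point is that none of this code knows anything about the third party behind the server: the English query goes in, the integration happens.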
Distribution Effect
MCP itself launched in November 2024. That was only three months ago, but in the age of AI it feels like years.
Back then, nobody really reacted.
Now, suddenly, the sheer scale of the AI community on X has made MCP go viral. Everyone's talking about it: those just jumping on it are making the wildest takes, those who get the connection are making money, and everyone is finally looking seriously into this new LLM capability and the opportunity it brings to deliver on Adaptive Computing.
Add product and distribution to AI research, and the magic happens.
A Word of Caution
Back in November, I was excited about MCP and quickly began integrating the first available MCP servers into my Scheduling Assistant, since it connects LLMs with Calendar and Gmail.
However, those early MCP servers offered only a limited set of “tools” that weren't sufficient to deliver a meaningful, complete experience. As a result, I ended up having to build my own custom LLM integrations to support the practical use cases I had in mind.
My hope is that this renewed enthusiasm around MCP will lead to servers built for real-world scenarios, rather than just flashy demos that momentarily impress the X crowd, but don't hold up in production environments.