Model Context Protocol: The Missing Layer Between AI and Your Apps

Open Spaces is Gun.io’s deep dive into tech through the eyes of our engineering community. This edition comes from community member Tim Kleier—Tech Lead (and veteran backend/full-stack engineer & solutions architect), whose toolkit runs on JavaScript, Ruby on Rails, and PostgreSQL. Tim unpacks Model Context Protocol (MCP), the “USB-C for AI” that lets LLMs like Claude or ChatGPT plug into calendars, CRMs, and codebases to drive real-time, personalized workflows. Whether you’re on a morning run or buried in terminal windows, MCP shows how the next wave of AI will act on your data, not just talk about it.


You’re out on your morning run, earbuds in, enjoying the crisp air. Suddenly, you remember you have a meeting right when you get into the office. You tap your earbuds and say:

“Hey Claude, what’s my first meeting today?”

Claude responds:

“8:30 a.m. with Acme Corp. Would you like a quick brief?”

You reply:

“Yes. Summarize the Hubspot notes and remind me of key meeting objectives.”

While you continue your run, Claude:

  • Checks your Google Calendar for event details.
  • Retrieves client information from Hubspot.
  • Summarizes Hubspot notes and meeting objectives.

This seamless interaction isn’t a distant future—it’s the emerging reality enabled by something called Model Context Protocol (MCP).

For a wider view on why that matters, see our article on the future of AI assistance.

The Current Challenge with LLMs

Large Language Models (LLMs) like Claude and ChatGPT have transformed how we interact with information. However, they have a significant limitation: they lack direct access to real-time, personalized data. In other words, they are blind to the outside world. 

Without structured access to tools and context, their responses can be generic and disconnected from specific user needs. Integrating LLMs with various tools (calendars, CRMs, email clients) often requires custom, brittle solutions. There’s a pressing need for a standardized, secure method to connect LLMs with the diverse data sources and applications users rely on daily.

Introducing the Model Context Protocol (MCP)

MCP is an open standard designed to bridge the gap between LLMs and the tools they need to interact with. You can think of MCP as the USB-C of AI applications—a universal connector that allows LLMs to interface with various systems seamlessly.

Here are some of MCP’s key features:

  • Interoperability: MCP enables any LLM to interact with any tool or data source, eliminating vendor lock-in.
  • Modularity: Its architecture separates concerns into hosts, clients, and servers, allowing for flexible integration.
  • Simplicity: MCP boils complex tool interactions down to three things: the requesting system, the action you want to take, and any input needed (see the sketch after this list).

For engineering teams eyeing production rollout, don’t miss our guide to scaling LLM infrastructure.
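
To make the “Simplicity” point concrete, here is a rough sketch of what a single tool invocation looks like on the wire. MCP frames these exchanges as JSON-RPC 2.0 messages; the tool name matches the calendar example used later in this article, and the calendarId value is a hypothetical placeholder rather than anything prescribed by the protocol.

```typescript
// A single MCP tool call, framed as JSON-RPC 2.0 (illustrative values only).
// The pieces map onto the "Simplicity" idea above:
//   - the client sending the request is the requesting system,
//   - params.name is the action you want to take,
//   - params.arguments is the input that action needs.
const toolCallRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "google_calendar_list_events",
    arguments: { calendarId: "primary" }, // hypothetical input
  },
};

// A typical response: the server returns content blocks that the host
// hands back to the LLM as context for its next turn.
const toolCallResponse = {
  jsonrpc: "2.0",
  id: 1,
  result: {
    content: [{ type: "text", text: "8:30 a.m. with Acme Corp" }],
  },
};

console.log(JSON.stringify(toolCallRequest), JSON.stringify(toolCallResponse));
```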

How MCP Works

MCP operates on a client-server architecture, with three key components:

  • Hosts: Applications like Claude Desktop that initiate requests.
  • Clients: Connectors within the host that manage communication with individual servers.
  • Servers: Services that expose specific capabilities or data sources to the host via the client (a minimal server sketch follows this list).
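
To ground the server role, here is a minimal sketch of an MCP server that exposes a single calendar tool. It assumes the official TypeScript SDK (@modelcontextprotocol/sdk) and zod for input validation; the tool name, the calendarId parameter, and the listEvents helper are illustrative, and exact imports and signatures may differ between SDK versions.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Hypothetical helper that would wrap the Google Calendar API; stubbed here.
async function listEvents(calendarId: string): Promise<string> {
  return `Events for ${calendarId}: 8:30 a.m. with Acme Corp`;
}

async function main() {
  // The server declares who it is; the host's MCP client connects to it.
  const server = new McpServer({ name: "calendar-server", version: "0.1.0" });

  // Register one tool: a name, an input schema, and a handler.
  server.tool(
    "google_calendar_list_events",
    { calendarId: z.string().describe("Which calendar to read") },
    async ({ calendarId }) => ({
      content: [{ type: "text", text: await listEvents(calendarId) }],
    })
  );

  // Expose the server over stdio so a host like Claude Desktop can launch it.
  await server.connect(new StdioServerTransport());
}

main().catch(console.error);
```

A host such as Claude Desktop would be configured to launch this process and would route the model’s tool calls to it through its MCP client.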

The diagram below shows a sample architecture of MCP interactions from Claude to our CRM (Hubspot) and calendar (Google). 

[Diagram: Claude (MCP host) connecting through MCP clients to the Hubspot and Google Calendar MCP servers]

This interaction between the MCP host and servers happens dynamically through tool discovery: each server advertises the tools it offers, and the LLM uses the conversation to choose the right tools for the job.

  • Checks your Google Calendar for event details → google_calendar_list_events
  • Retrieves client information from Hubspot → hubspot_get_account
  • Summarizes Hubspot notes and meeting objectives → no tool needed; the LLM summarizes the data the first two calls returned

Each of these tools needs input, of course, like a calendar ID for retrieving events or a Hubspot account ID, but that’s a little too in-depth to cover fully in this introductory article. The point is that MCP is a powerful architecture that simplifies how an LLM reaches functionality in other systems.
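
For the curious, here is a rough sketch of the discovery step that makes this possible. When the host connects, it asks each server to list its tools, and those definitions (names, descriptions, and input schemas) are what let the LLM pick google_calendar_list_events or hubspot_get_account and supply the right inputs. The schemas below are illustrative assumptions, not taken from any real Hubspot or Google integration.

```typescript
// What a host might receive back from a "tools/list" request (illustrative).
// The LLM reads these definitions during the conversation and decides
// which tool, if any, fits the user's request.
const toolsListResponse = {
  jsonrpc: "2.0",
  id: 1,
  result: {
    tools: [
      {
        name: "google_calendar_list_events",
        description: "List events from a Google calendar",
        inputSchema: {
          type: "object",
          properties: { calendarId: { type: "string" } },
          required: ["calendarId"],
        },
      },
      {
        name: "hubspot_get_account",
        description: "Fetch a Hubspot account and its recent notes",
        inputSchema: {
          type: "object",
          properties: { accountId: { type: "string" } },
          required: ["accountId"],
        },
      },
    ],
  },
};

console.log(`Server offers ${toolsListResponse.result.tools.length} tools`);
```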

Real-World Applications

With MCP, the possibilities for enhancing AI interactions are vast:

  • Personal Assistants: LLMs can access calendars, emails, and documents to provide personalized assistance.
  • Business Workflows: Automate tasks like summarizing meetings, drafting emails, and updating CRMs.
  • Internet of Things: Enable AI-empowered communication across connected devices.
  • Development Tools: Integrate with IDEs to assist in code generation, debugging, and documentation.

Who Builds What?

Rolling out MCP can involve backend engineers, DevOps, and AI specialists. Deciding whether you need an ML engineer or a strong backend dev? See our AI/ML engineer vs. backend developer comparison for a quick framework.

What’s Next in This Series

This article is the first in a series exploring the practical applications of MCP:

  1. Wrapping an Existing API with MCP: Learn how to expose your current APIs to LLMs using MCP.
  2. Building a Standalone MCP Server: A step-by-step guide to creating a server that interfaces with various tools.
  3. Integrating MCP with Business Tools: Discover how to connect LLMs with CRMs, project management tools, and more to streamline workflows.

Stay tuned as we delve deeper into each of these topics, providing code examples, best practices, and insights to help you harness the full potential of MCP in your projects.
