Skybridge enables you to build ChatGPT Apps and MCP Apps - interactive UI widgets that render inside AI conversations. Before diving into Skybridge’s APIs, understand the underlying protocols and runtimes it builds upon.

MCP (Model Context Protocol)

MCP is an open standard that allows AI models to connect with external tools, resources, and services. Think of it as an API layer specifically designed for LLMs.

What is an MCP Client?

An MCP Client is a frontend application that implements the MCP protocol and can consume MCP Servers. Major MCP Clients include:
  • General-purpose AI apps: ChatGPT, Claude, Goose, etc.
  • IDEs: Cursor, VSCode, Amp, etc.
  • Coding agents: Claude Code, Codex CLI, Gemini CLI, etc.
  • Any other software that implements the MCP protocol

What is an MCP Server?

An MCP server is a backend service that implements the MCP protocol. It exposes capabilities to MCP Clients through:
  • Tools: Functions the model can call (e.g., search_flights, get_weather, book_hotel)
  • Resources: Data the model can access (e.g., files, database records, UI components)
When you ask an AI assistant a question, it can invoke tools on your MCP server to fetch data or perform actions on your behalf. The server handles your business logic, database queries, API calls, and any other backend operations.
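Conceptually, a tool is just a named function with a JSON-Schema-style input contract that the model can invoke by name. The sketch below illustrates that shape in plain TypeScript (this is not the official MCP SDK; the `get_weather` tool echoes the example above, and the handler logic is invented):

```typescript
// Minimal sketch of an MCP-style tool registry (illustrative, not the SDK).
type Tool = {
  name: string;
  description: string;
  // JSON-Schema-like description of the arguments the model may pass.
  inputSchema: { type: "object"; properties: Record<string, { type: string }> };
  handler: (args: Record<string, unknown>) => Promise<string>;
};

const tools = new Map<string, Tool>();

tools.set("get_weather", {
  name: "get_weather",
  description: "Return current weather for a city",
  inputSchema: { type: "object", properties: { city: { type: "string" } } },
  // In a real server this would run your backend logic or call an external API.
  handler: async (args) => `Sunny in ${args.city}`,
});

// The client sends a tools/call request; the server dispatches by tool name.
async function callTool(
  name: string,
  args: Record<string, unknown>
): Promise<string> {
  const tool = tools.get(name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.handler(args);
}

callTool("get_weather", { city: "Paris" }).then(console.log);
```

In a real server you would register such tools through an MCP SDK, which also handles the protocol transport and advertises each tool's schema to the client.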

MCP Apps and ChatGPT Apps: The Same Foundation

Both ChatGPT Apps and MCP Apps are built on top of MCP servers, and extend the MCP protocol to allow rich UI rendering. Both use the same foundations and concepts, but differ in implementation: ChatGPT Apps are built on top of the OpenAI Apps SDK, while MCP Apps are built on top of the MCP Apps extension. To avoid repetition, we will refer to both ChatGPT Apps and MCP Apps collectively as AI Apps.

An AI App consists of two components working together:
  1. MCP Server: Your backend that handles business logic and exposes tools via the MCP protocol
  2. UI Widgets: HTML components that render in the AI Client’s interface as interactive UIs
[Diagram: MCP Apps architecture]

When a tool is called, it can return both:
  • Text content: What the model sees and responds with
  • Widget content: A visual UI that renders for the user
This creates a dual-surface interaction model: users interact with both the conversational interface (the AI) and your custom UI (the widget).
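To make the dual-surface idea concrete, here is a sketch of what a tool result carrying both surfaces might look like as a plain object. The `content` and `structuredContent` field names follow MCP's tool-result convention, but the `_meta` key, widget URI, and weather data are invented for illustration:

```typescript
// Illustrative shape of a dual-surface tool result (not a full MCP type).
type ToolResult = {
  // Text content: what the model reads and responds with.
  content: { type: "text"; text: string }[];
  // Structured data the host hands to the widget for rendering.
  structuredContent?: Record<string, unknown>;
  // Metadata pointing at the HTML widget resource to render (key invented).
  _meta?: Record<string, unknown>;
};

function getWeatherResult(city: string): ToolResult {
  return {
    content: [{ type: "text", text: `Weather in ${city}: 21°C, sunny` }],
    structuredContent: { city, tempC: 21, condition: "sunny" },
    _meta: { "example/widgetUri": "ui://weather/card.html" },
  };
}

const result = getWeatherResult("Lisbon");
console.log(result.content[0].text);
```

The model only ever sees the text surface; the widget renders from the structured data, so the two surfaces can stay in sync without duplicating your backend logic.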
Read our in-depth blog article for a detailed technical breakdown of how AI Apps work under the hood.

Runtime Environments

While both ChatGPT Apps and MCP Apps use the same MCP server architecture, they differ in how they render and communicate with UI widgets. Think of it this way: your App is the engine, and the runtime environment is the interface layer.

Skybridge supports the two main runtime environments for rendering widgets: the OpenAI Apps SDK (ChatGPT) and the MCP Apps extension. Skybridge abstracts away the differences between these runtime environments so you can write your widgets once and run them anywhere. Learn more in our Write Once, Run Everywhere guide.
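One way to picture this abstraction is a small adapter: detect which host bridge is present and expose a single interface to the widget. The sketch below is hypothetical (ChatGPT's Apps SDK runtime does expose a `window.openai` global, but the `HostBridge` interface, the `getToolOutput` helper, and the MCP-side `__mcpToolOutput` field are invented for illustration):

```typescript
// Hypothetical adapter: one widget-facing interface over two runtimes.
type HostBridge = {
  kind: "apps-sdk" | "mcp-apps";
  // Unified accessor for the data the tool call produced (invented name).
  getToolOutput: () => unknown;
};

// `host` stands in for the widget's global object; injected for testability.
function detectBridge(host: Record<string, any>): HostBridge {
  if (host.openai) {
    // The ChatGPT Apps SDK runtime exposes a `window.openai` global.
    return { kind: "apps-sdk", getToolOutput: () => host.openai.toolOutput };
  }
  // Otherwise assume an MCP Apps host (shape invented for this sketch).
  return { kind: "mcp-apps", getToolOutput: () => host.__mcpToolOutput };
}

// A widget written against HostBridge runs unchanged in either runtime.
const bridge = detectBridge({ openai: { toolOutput: { city: "Oslo" } } });
console.log(bridge.kind);
```

This is roughly the role Skybridge plays for you: your widget code targets one API, and the adapter layer handles the per-runtime differences.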

Runtimes Comparison at a Glance

Feature          Apps SDK (ChatGPT)           MCP Apps
Protocol         Proprietary window.openai    Open MCP ext-apps spec
Client Support   ChatGPT only                 Goose, VSCode, Postman, …
Documentation    Apps SDK Docs                ext-apps specs

Next Steps