Skybridge enables you to build ChatGPT Apps and MCP Apps: interactive UI widgets that render inside AI conversations. Before diving into Skybridge's APIs, it helps to understand the underlying protocols and runtimes it builds on.

How widgets appear in MCP Apps Clients

In MCP Apps clients, the model triggers your tool, and the client then renders both the assistant response and your widget from the tool result.
[Diagram: MCP Apps Architecture]

MCP (Model Context Protocol)

MCP is an open standard that allows AI models to connect with external tools, resources, and services. Think of it as an API layer specifically designed for LLMs.

What is an MCP Client?

An MCP client is a frontend application that implements the MCP protocol and can consume MCP servers. Major MCP clients include:
  • General-purpose AI apps: ChatGPT, Claude, Goose, etc.
  • IDEs: Cursor, VSCode, Amp, etc.
  • Coding agents: Claude Code, Codex CLI, Gemini CLI, etc.
  • Any other software that implements the MCP protocol

What is an MCP Server?

An MCP server is a backend service that implements the MCP protocol. It exposes capabilities to MCP Clients through:
  • Tools: Functions the model can call (e.g., search_flights, get_weather, book_hotel)
  • Resources: Data the model can access (e.g., files, database records, UI components)
When you ask an AI assistant a question, it can invoke tools on your MCP server to fetch data or perform actions on your behalf. The server handles your business logic, database queries, API calls, and any other backend operations.
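To make the tool flow concrete, here is a minimal sketch of how a server might describe and dispatch a tool. The `search_flights` tool, its schema, and the `callTool` helper are all illustrative assumptions, not the API of any particular MCP SDK:

```typescript
// Hypothetical sketch of tool definition and dispatch on an MCP server.
// `search_flights` and `callTool` are illustrative names, not a real SDK API.

type ToolDefinition = {
  name: string;
  description: string;
  inputSchema: object; // JSON Schema describing the tool's arguments
  handler: (args: Record<string, unknown>) => Promise<unknown>;
};

const searchFlights: ToolDefinition = {
  name: "search_flights",
  description: "Search for flights between two airports on a given date",
  inputSchema: {
    type: "object",
    properties: {
      origin: { type: "string" },
      destination: { type: "string" },
      date: { type: "string", format: "date" },
    },
    required: ["origin", "destination", "date"],
  },
  // The handler is where your business logic lives: database queries,
  // third-party API calls, and other backend operations.
  handler: async (args) => ({ flights: [], query: args }),
};

// When the model issues a tools/call request, the server looks up the tool
// by name and runs its handler with the model-supplied arguments.
async function callTool(
  tools: ToolDefinition[],
  name: string,
  args: Record<string, unknown>
): Promise<unknown> {
  const tool = tools.find((t) => t.name === name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.handler(args);
}
```

Production servers would typically rely on an official MCP SDK rather than hand-rolling dispatch, but the shape of the contract (name, JSON Schema, handler) is the same.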

MCP Apps and ChatGPT Apps: The Same Foundation

MCP Apps is the open UI extension for MCP. It defines the portable contract for interactive widgets in AI clients, including the ui/* bridge, tools/call, and _meta.ui.resourceUri. ChatGPT Apps use that same MCP Apps contract in ChatGPT, and the OpenAI Apps SDK adds window.openai APIs for ChatGPT-specific capabilities. To avoid repetition, we will now refer to both ChatGPT Apps and MCP Apps as AI Apps. An AI App consists of two components working together:
  1. MCP Server: Your backend that handles business logic and exposes tools via the MCP protocol
  2. UI Widgets: HTML components that render in the AI Client’s interface as interactive UIs
When a tool is called, it can return both:
  • Text content: What the model sees and responds with
  • Widget content: A visual UI that renders for the user
This creates a dual-surface interaction model: users interact with both the conversational interface (the AI) and your custom UI (the widget).
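As a sketch, a dual-surface tool result can carry both payloads at once. The field names follow the `_meta.ui.resourceUri` convention mentioned above; the widget URI and text are illustrative:

```typescript
// Sketch of a dual-surface tool result: text for the model, plus a pointer
// to the widget resource for the client to render. The ui:// URI is made up.
const toolResult = {
  // Text content: what the model sees and can summarize in its reply.
  content: [
    { type: "text", text: "Found 3 flights from SFO to JFK on 2025-06-01." },
  ],
  // Widget content: the UI resource the client should render for the user.
  _meta: {
    ui: { resourceUri: "ui://flights/results-widget.html" },
  },
};
```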
Read our in-depth blog article for a detailed technical breakdown of how AI Apps work under the hood.

Runtime Environments

Both ChatGPT Apps and MCP Apps use the same MCP server architecture and the same portable MCP Apps UI contract. The key practical difference is that ChatGPT additionally exposes window.openai extensions. Think of it this way: your app logic and portable bridge stay the same, and ChatGPT can optionally provide extra capabilities. Skybridge supports the two main runtime environments for rendering widgets:

Apps SDK (ChatGPT)

ChatGPT host runtime. Implements MCP Apps and adds optional window.openai APIs for ChatGPT-only features.
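As an illustration of the call pattern (not the verified API surface, which is documented in the Apps SDK docs), a widget running in ChatGPT might ask the host to re-invoke a server tool. The stub below stands in for the real host-injected `window.openai` object, and the `callTool` method shown is an assumption:

```typescript
// Illustrative stub of a ChatGPT-style host bridge. In ChatGPT itself the
// host injects `window.openai`; this mock only demonstrates the call shape.
const openaiHost = {
  async callTool(name: string, args: Record<string, unknown>) {
    return { ok: true, name, args }; // a real host would round-trip to the server
  },
};

async function refreshWeather() {
  // A widget asks the host to re-invoke a server tool on its behalf.
  return openaiHost.callTool("get_weather", { city: "Paris" });
}
```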

MCP Apps

Open MCP Apps specification. JSON-RPC postMessage bridge that works across multiple AI clients.
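The bridge speaks JSON-RPC 2.0 over `postMessage` between the widget iframe and the host. A minimal request envelope for a `tools/call` might look like this (the tool name and arguments are illustrative):

```typescript
// A JSON-RPC 2.0 request as the widget would post it to the host.
type JsonRpcRequest = {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
};

const request: JsonRpcRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: { name: "get_weather", arguments: { city: "Paris" } },
};

// In a real widget this would be sent via window.parent.postMessage(request, "*"),
// and the host would reply with a { jsonrpc, id, result } response message.
```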
Skybridge abstracts away the differences between these runtime environments so you can write your widgets once and run them anywhere. Learn more in our Write Once, Run Everywhere guide.

Runtimes Comparison at a Glance

Feature        | Apps SDK (ChatGPT)                                  | MCP Apps
Protocol       | MCP Apps bridge + optional window.openai extensions | Open MCP Apps (ext-apps) spec
Client Support | ChatGPT only                                        | Goose, VSCode, Postman, …
Documentation  | Apps SDK Docs and MCP Apps compatibility in ChatGPT | ext-apps specs

Next Steps

Apps SDK Deep Dive

ChatGPT-specific APIs, window.openai, and exclusive features

MCP Apps Deep Dive

The open specification, JSON-RPC bridge, and client support

Write Once, Run Everywhere

How Skybridge abstracts these differences for you

Quickstart

Start building your first app