
Building ChatGPT Apps on Cloudflare with the OpenAI Apps SDK

Posted on: December 19, 2025

Introduction to ChatGPT Apps and the OpenAI SDK

OpenAI’s new ChatGPT Apps SDK turns ChatGPT into a platform for interactive applications that blend conversation with dynamic interfaces. These apps live inside the ChatGPT chat – they respond to natural language and can display interactive UI elements right in the chat interface. For example, an app can show maps, playlists, forms, or even games as part of the conversation. This makes for a more engaging experience where users can chat and interact with content simultaneously. From the developer’s perspective, building with the Apps SDK opens access to ChatGPT’s enormous user base (over 800 million users) and allows apps to appear contextually when relevant.

Importantly, the Apps SDK is built on the Model Context Protocol (MCP) – an open standard for connecting ChatGPT to external tools and data. OpenAI has made the Apps SDK open source so that you can host your app on any platform that supports this standard. This means you’re not limited to OpenAI’s servers; you can run your ChatGPT-connected app on your own backend. Cloudflare has embraced this by providing an excellent environment to deploy ChatGPT apps on the edge. In this guide, we’ll explore how to build ChatGPT Apps using Cloudflare’s platform, taking advantage of its global serverless infrastructure, storage options, and tools that extend beyond what other platforms offer.

Why Build ChatGPT Apps on Cloudflare?

Cloudflare Workers, Cloudflare’s serverless JavaScript/TypeScript platform, is a powerful host for ChatGPT apps. By deploying your app on Cloudflare, you get instant global distribution (low latency for users worldwide) and seamless scaling without managing servers. Cloudflare has even introduced an Agents SDK that works with OpenAI’s Apps SDK, making it easier to implement the MCP server and app logic in a Cloudflare Worker. With Cloudflare, your ChatGPT app can leverage capabilities unique to the platform, such as Durable Objects for persistent multi-user state and real-time sync.

Moreover, Cloudflare offers various storage and compute services that you can integrate into your ChatGPT app. You can use Cloudflare D1 (a distributed SQL database) to store application data or knowledge bases, KV storage for simple key-value caching, and R2 object storage for files or large assets – all accessible from within your Worker. Cloudflare’s global network ensures your app’s API endpoints and assets are delivered quickly to ChatGPT, minimizing latency when ChatGPT invokes your app. In short, Cloudflare’s platform provides the scalability, state management, and developer tools to build ChatGPT apps that go beyond basic functionality.
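As a sketch of what that integration can look like, here is a hypothetical helper that reads tasks through a KV cache backed by a D1 table. The binding names (`DB`, `CACHE`), the `tasks` table, and the minimal structural types standing in for Cloudflare’s `D1Database` and `KVNamespace` are all assumptions for illustration:

```typescript
// Sketch: reading tasks through a KV cache backed by a D1 table.
// KVLike / D1Like are minimal stand-ins for Cloudflare's KVNamespace and
// D1Database so the sketch is self-contained; binding names are assumptions.
interface KVLike {
  get(key: string, type: "json"): Promise<unknown | null>;
  put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>;
}
interface D1Like {
  prepare(sql: string): {
    bind(...values: unknown[]): { all(): Promise<{ results: unknown[] }> };
  };
}
interface Env {
  DB: D1Like;    // D1 binding from wrangler config (assumed name)
  CACHE: KVLike; // KV binding from wrangler config (assumed name)
}

async function loadUserTasks(env: Env, userId: string): Promise<unknown[]> {
  // Check the KV cache first to avoid a database round trip.
  const cached = await env.CACHE.get(`tasks:${userId}`, "json");
  if (cached) return cached as unknown[];

  // Fall back to D1 with a parameterized query.
  const { results } = await env.DB
    .prepare("SELECT id, title, done FROM tasks WHERE user_id = ?")
    .bind(userId)
    .all();

  // Cache the result for five minutes.
  await env.CACHE.put(`tasks:${userId}`, JSON.stringify(results), {
    expirationTtl: 300,
  });
  return results;
}
```

The same pattern extends to R2 for larger assets: fetch from the bucket, then cache or stream the result from the Worker.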

Setting Up Your Cloudflare ChatGPT App Environment

Before diving into coding, make sure you have the necessary accounts and tools ready: a Cloudflare account, Node.js with npm and the Wrangler CLI installed, and a ChatGPT account with developer mode enabled so you can register custom apps.

With the prerequisites ready, we can start building the project.

Creating the Cloudflare Workers Project

1. Initialize a new Cloudflare Workers project: Cloudflare provides a template to quickly set up a project. In your terminal, run the command:

npm create cloudflare@latest my-chatgpt-app

This will scaffold a Cloudflare Worker project in a new folder (named my-chatgpt-app in this example) with the necessary configuration files. You can also initialize it with Yarn or pnpm. Navigate into your project directory after creation.

2. Install required dependencies: Inside your project, install the libraries that will help build the ChatGPT app. Specifically, you’ll need the OpenAI MCP SDK and Cloudflare’s Agents SDK, plus any UI libraries. For example, run:

npm install agents @modelcontextprotocol/sdk

This pulls in Cloudflare’s agents package (which provides helpful abstractions for MCP and Durable Objects) and the official @modelcontextprotocol/sdk. Additionally, if you plan to build a UI with React (as many apps do for interactive widgets), install react, react-dom, and any UI components or styling libraries you need. The Cloudflare example uses a chessboard UI, so it included packages like react-chessboard for that purpose. For development tooling, you might also install a bundler like Vite and any necessary plugins (for instance, to bundle your React app into a single file that is easy to serve).

3. Configure Durable Objects and settings: If your app needs state or multi-user coordination, configure a Durable Object in your Cloudflare Worker. In the project’s wrangler.toml or wrangler.jsonc config, define a Durable Object binding. For example, to use a Durable Object class named TodoStore, you’d add something like:

// wrangler.jsonc excerpt
{
  "durable_objects": {
    "bindings": [
      { "name": "TODO", "class_name": "TodoStore" }
    ]
  },
  // New Durable Object classes also need a one-time migration entry
  "migrations": [
    { "tag": "v1", "new_classes": ["TodoStore"] }
  ],
  "compatibility_flags": ["nodejs_compat"]
}

This registers a Durable Object class with Cloudflare, allowing the Worker to instantiate it as needed. Durable Objects give you isolated, persistent state – perfect for things like game sessions, shared documents, or any data that should persist and be synced between users. Also note the compatibility flags if needed (e.g. enabling certain Node APIs in the Worker). Save your config after adding DO bindings or other settings (like asset bundling for the UI).
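To make the TodoStore idea concrete, here is a minimal sketch of what such a class might hold. In a real Worker the storage comes from the runtime (the Durable Object’s state), so the StorageLike interface below is a self-contained stand-in, and the method names are illustrative assumptions:

```typescript
// Sketch of a Durable Object-style class. In a real Worker, storage is
// provided by the runtime; StorageLike is a stand-in so this sketch runs
// anywhere, and the method names are illustrative assumptions.
interface StorageLike {
  get(key: string): Promise<unknown | undefined>;
  put(key: string, value: unknown): Promise<void>;
}

export class TodoStore {
  constructor(private storage: StorageLike) {}

  // Append a task and persist the whole list.
  async addTask(title: string): Promise<string[]> {
    const tasks = ((await this.storage.get("tasks")) as string[] | undefined) ?? [];
    tasks.push(title);
    await this.storage.put("tasks", tasks);
    return tasks;
  }

  // Read the persisted list (empty if nothing stored yet).
  async listTasks(): Promise<string[]> {
    return ((await this.storage.get("tasks")) as string[] | undefined) ?? [];
  }
}
```

Because each Durable Object instance has its own storage, two users sharing a list hit the same instance and see the same state.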

Building the ChatGPT App Backend (MCP Server)

The core of a ChatGPT app is the MCP server – essentially your app’s backend that ChatGPT will communicate with. On Cloudflare, this backend is implemented in the Worker (the src/index.ts or similar entry file). Here’s how to build it:

1. Initialize the MCP server: Use the MCP SDK to create a server instance in your code. For example:

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";

const server = new McpServer({ name: "MyApp", version: "0.1.0" });

This sets up an MCP server with a name and version. The name is what will appear in ChatGPT as the app’s identifier.

2. Register your app’s UI resource: If your app has a custom UI, you need to tell ChatGPT how to retrieve and display it. Typically, you’ll have an HTML file (possibly generated from a React build) that acts as the widget. Register it with the server as a resource. For example, Cloudflare’s chess app registers a resource like:

server.registerResource(
  "chess",                      // resource name or identifier
  "ui://widget/index.html",     // resource URI used in templates
  {},
  async (_uri, extra) => {
    return {
      contents: [{
        uri: "ui://widget/index.html",
        mimeType: "text/html+skybridge",
        text: await getWidgetHtml(extra.requestInfo?.headers.host as string)
      }]
    };
  }
);

This code (from the chess example) tells the server that when ChatGPT requests the ui://widget/index.html resource, the Worker should respond with the HTML content of the widget. In practice, getWidgetHtml(…) would return the compiled HTML of your React app (with any needed script and style included). The mimeType "text/html+skybridge" is a special content type indicating an interactive ChatGPT widget. You will customize this to serve your app’s UI. Cloudflare Workers can serve static content or generated HTML easily – often, the UI build process produces an HTML file and some JS/CSS which you can embed or inline for delivery.
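As an illustration, a hypothetical getWidgetHtml helper might look like the following. WIDGET_JS stands in for your compiled React bundle, and the window.API_HOST convention is an assumption of this sketch, not part of the Apps SDK:

```typescript
// Hypothetical getWidgetHtml: inline the built widget bundle into a single
// HTML document. WIDGET_JS stands in for the compiled React bundle; the
// window.API_HOST convention is an assumption of this sketch.
const WIDGET_JS = `console.log("widget booted");`;

async function getWidgetHtml(host: string): Promise<string> {
  // Injecting the host lets the widget call back to the Worker's endpoints.
  return `<!doctype html>
<html>
  <head><meta charset="utf-8"></head>
  <body>
    <div id="root"></div>
    <script>window.API_HOST = ${JSON.stringify(host)};</script>
    <script type="module">${WIDGET_JS}</script>
  </body>
</html>`;
}
```

Inlining everything into one document keeps the resource response self-contained, which is why single-file bundling (e.g. vite-plugin-singlefile) is a common choice here.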

3. Register tools (actions) that ChatGPT can call: Tools are the operations your app exposes to ChatGPT. Each tool usually corresponds to a specific user intent or action – for example, “search database”, “start a new game”, “add a task”, etc. When ChatGPT decides to use your app, it will invoke these tools through the MCP server. You register tools with a name, some metadata, and a function to execute. For instance, the chess app registers a playChess tool like so:

server.registerTool(
  "playChess",
  {
    title: "Renders a chess game menu, ready to start or join a game.",
    annotations: { readOnlyHint: true },
    _meta: {
      "openai/outputTemplate": "ui://widget/index.html",
      // ... (other meta like messages to show when invoking)
    }
  },
  async (_, _extra) => {
    // The implementation when ChatGPT calls this tool
    return { content: [{ type: "text", text: "Successfully rendered chess game menu" }] };
  }
);

Here, playChess is a tool that doesn’t take input parameters – it simply instructs ChatGPT to display the chess UI by referencing the widget’s URI. The _meta "openai/outputTemplate" entry points to the UI resource, meaning the output of this tool is the embedded UI. In your app, you might have multiple tools. For example, a to-do app might have tools like addTask, listTasks, or showStats, each triggering some UI or returning info. Provide clear titles and optionally descriptions for each, as these help ChatGPT understand when to use them.
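For a tool that does take input, the handler is plain TypeScript. Below is a hedged sketch of a hypothetical addTask tool: the handler is defined on its own (so it can be exercised in isolation), while the registration call – which mirrors the playChess shape above and assumes the MCP SDK’s zod-based inputSchema – is shown as a comment:

```typescript
// Hedged sketch of a tool with an input parameter. The handler is plain
// TypeScript; the registration (assuming the MCP SDK's zod-based
// inputSchema) is shown as a comment, mirroring the playChess shape above.
type ToolResult = { content: Array<{ type: "text"; text: string }> };

const tasks: string[] = []; // in-memory only; a real app would use a Durable Object or D1

async function addTaskHandler(args: { title: string }): Promise<ToolResult> {
  tasks.push(args.title);
  return {
    content: [
      { type: "text", text: `Added task: ${args.title} (${tasks.length} total)` },
    ],
  };
}

// server.registerTool(
//   "addTask",
//   {
//     title: "Add a task to the to-do list",
//     inputSchema: { title: z.string() },
//     _meta: { "openai/outputTemplate": "ui://widget/index.html" },
//   },
//   addTaskHandler
// );
```

Keeping handlers as standalone functions also makes them easy to unit-test before wiring them into the server.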

4. Implement the app’s logic and state handling: If your app requires complex logic or state (which many do), implement that on the Worker side. For simple stateless tools, you can handle everything in the tool function itself. But for more complex scenarios, it’s advisable to use classes or modules – especially with Durable Objects for stateful logic. Cloudflare’s Agents framework allows you to define an Agent class (often extending from the base Agent class) that can hold state and expose methods to be called. In the chess example, they defined a ChessGame class extending Agent, with an internal chess engine and a state (board positions, players, game status) stored in a Durable Object. Each instance of ChessGame is tied to a Durable Object (so each game has isolated state, which is automatically persisted and synced across players).

Within such classes, you can mark certain methods as callable by ChatGPT or the UI. Cloudflare’s SDK uses a @callable() decorator to expose methods. For example, in a game or collaborative app, you might have:

// imports assumed from Cloudflare's Agents SDK; check its docs for the decorator's exact export
import { Agent, callable } from "agents";

export class ChessGame extends Agent {
  // ...initial state...

  @callable()
  join(params: { playerId: string }) {
    // logic for a player joining the game
    // update state, etc.
    return { ok: true, state: this.state };
  }

  @callable()
  move(params: { from: string; to: string }) {
    // logic to make a chess move, update state
    return { ok: true, state: this.state };
  }
  // ... other methods ...
}

By marking a method with @callable(), you make it available as a tool/action that the client (ChatGPT, or the UI via ChatGPT) can invoke. In practice, ChatGPT calls your top-level registered tools; those can in turn delegate to your class methods or Durable Object logic as needed. For instance, a startGame tool might create a new Durable Object instance of ChessGame, and ChatGPT will then use other tools that interact with that instance. The key is that Durable Objects give you persistent storage and concurrency control, while callable methods give ChatGPT safe entry points to modify or retrieve state.
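One hedged sketch of that delegation: derive a deterministic Durable Object name from a shared invite code so every participant reaches the same instance. The CHESS_GAME binding name and the request shape in the commented portion are assumptions of this sketch:

```typescript
// Sketch of delegating a tool call to a Durable Object. A deterministic
// name derived from a shared invite code means both players reach the same
// instance. CHESS_GAME and the /join request shape are assumptions.
function gameIdFromInvite(invite: string): string {
  // Normalize so "KnightClub " and "knightclub" map to the same object.
  return `game:${invite.trim().toLowerCase()}`;
}

// Inside a startGame tool handler (env.CHESS_GAME is the assumed DO binding):
// const id = env.CHESS_GAME.idFromName(gameIdFromInvite(invite));
// const stub = env.CHESS_GAME.get(id);
// await stub.fetch(new Request("https://do/join", { method: "POST" }));
```

idFromName is what makes the mapping stable: the same name always resolves to the same Durable Object, wherever the players connect from.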

5. Handle UI interactions and messages: When your app includes a UI, user interactions in that UI (button clicks, form submissions, dragging a chess piece, etc.) need to communicate with your backend. In a Cloudflare ChatGPT app, the UI can call back to your Worker (e.g., via fetch calls to specific endpoints or via a WebSocket to a Durable Object). The Cloudflare Agents SDK simplifies this with hooks: for example, the React hook useAgent() connects your React components to the Durable Object backend. This means your UI can call the methods we marked as @callable() simply by calling, say, await stub.move({ from: "e2", to: "e4" }) in the front-end code – the SDK handles routing that call to the Durable Object method on the backend. This makes building interactive features much easier.

Additionally, bidirectional communication allows the UI to prompt ChatGPT. For instance, in the chess app, when the user clicks “Ask for help”, the front-end uses window.openai.sendFollowUpMessage(…) to send a new message into the chat with context (like the current board state). ChatGPT then responds in the conversation (providing chess advice), even though the user triggered it via the UI. This kind of seamless hand-off between UI and AI is a powerful pattern unique to ChatGPT apps.
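A sketch of that pattern, assuming sendFollowUpMessage accepts a prompt payload; the buildHelpPrompt helper is hypothetical and kept separate so it also runs outside ChatGPT, where askForHelp simply no-ops:

```typescript
// Sketch of the "Ask for help" hand-off. window.openai exists only inside
// ChatGPT; buildHelpPrompt is a hypothetical helper kept separate so it can
// run in local development, where askForHelp simply no-ops.
function buildHelpPrompt(fen: string): string {
  return `I'm playing chess and it's my move. Current position (FEN): ${fen}. What should I play?`;
}

async function askForHelp(fen: string): Promise<void> {
  const openai = (globalThis as any).window?.openai;
  if (!openai) return; // outside ChatGPT (e.g. local dev): do nothing
  await openai.sendFollowUpMessage({ prompt: buildHelpPrompt(fen) });
}
```

Packing the board state into the prompt is what lets ChatGPT give situation-aware advice without any extra round trip to the backend.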

Building and Serving the App’s UI

Most ChatGPT apps will have a custom UI component – whether it’s a simple HTML output (like a table of results, or a form) or a complex interactive widget (like a game board or map). Cloudflare Workers can serve this UI to ChatGPT.

1. Develop your UI: You can create your UI using web technologies. Commonly, developers use React (with or without a design library like Tailwind CSS, Material UI, etc.) to build the interface. This UI runs inside ChatGPT’s iframe, so you should keep it relatively light and self-contained. For example, the vibe-board to-do app uses React and Tailwind to create its interface within ChatGPT. Ensure your UI is responsive and fits well in the chat panel.

2. Prepare the UI for embedding: When you’re ready to integrate the UI, you’ll want to bundle it into a deliverable format. Typically, you’ll produce an HTML file that includes your app’s JS and CSS. Tools like Vite can help compile and bundle your React app into a single HTML file (using plugins such as vite-plugin-singlefile to inline scripts). The goal is to have an HTML entry point that the ChatGPT client can load to render your widget. In your Cloudflare Worker code (as shown earlier with server.registerResource), you’ll serve this HTML content.

3. Serving static assets via the Worker: Cloudflare Workers can include static asset bundles, or you can have your build output placed in a known location for the Worker to read. One approach (used by the official examples) is to embed the UI assets in the Worker script itself or serve them from the Worker’s environment. The vibe-board example, for instance, compiles the React app and then updates a widget-manifest.ts in the Worker code with hashed asset names. Simpler approaches might include Base64-encoding assets or using Cloudflare’s KV store to hold the files. For development ease, you can also serve the UI from a public URL, but for production it’s best to have the Worker serve it so that everything is self-contained.

4. Testing the UI outside ChatGPT (optional): During development, you might want to run your UI in a normal browser to ensure it works independently. You can use a dev server (like npm run dev if using Vite) to serve the React app on localhost for quick iteration. Keep in mind some functionality (like window.openai for sending messages to ChatGPT) won’t work outside ChatGPT, but you can mock or ignore those during UI development. Once satisfied, run your production build process to generate the final widget assets.
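One way to handle that mocking is a small dev-only shim: a no-op stand-in for window.openai so the widget runs in a plain browser. The shim below only mirrors the single method used in this article; the rest of the real API surface is not modeled and is an assumption:

```typescript
// Dev-only shim: install a no-op window.openai so the widget can run in a
// plain browser. Only the one method used in this article is mirrored; the
// rest of the real API surface is not modeled here.
function installOpenAiShim(global: any) {
  if (global.openai) return global.openai; // inside ChatGPT: keep the real host object
  global.openai = {
    sendFollowUpMessage: async (args: { prompt: string }) => {
      console.log("[shim] sendFollowUpMessage:", args.prompt);
    },
  };
  return global.openai;
}

// In the widget's entry point, before any component renders:
// installOpenAiShim(window);
```

Because the shim returns early when the real host object exists, it is safe to leave in production builds.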

Deploying the ChatGPT App to Cloudflare

Once your MCP server (backend) and widget (frontend) are ready, deploying to Cloudflare is straightforward:

1. Publish your Worker: Use the Wrangler CLI to deploy. Run wrangler deploy in your project directory (make sure you’re logged in via wrangler login). This will upload your Worker code to Cloudflare’s network and give you a live URL, typically https://<your-worker>.<your-subdomain>.workers.dev. If you configured a custom domain in Cloudflare for Workers, it could be a custom URL instead. For now, note the workers.dev URL.

2. Verify the MCP endpoint: By default, the MCP server might be served at a specific path. In Cloudflare’s example template, the MCP server listens on the /mcp path of your Worker URL. For instance, if your Worker is my-chat-app.workers.dev, the MCP endpoint could be https://my-chat-app.workers.dev/mcp. Check your code or documentation to confirm the exact endpoint. This is the address ChatGPT will use to interface with your app.
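A sketch of how that routing might look in the Worker’s fetch handler. The pure route helper is testable on its own, while the commented portion assumes hypothetical handleMcpRequest and getWidgetHtml helpers standing in for your transport and UI glue:

```typescript
// Sketch of path routing in a Worker's fetch handler: /mcp goes to the MCP
// transport, the root serves the widget, everything else 404s. The helpers
// in the commented portion (handleMcpRequest, getWidgetHtml) are assumed.
function route(pathname: string): "mcp" | "widget" | "not_found" {
  if (pathname === "/mcp" || pathname.startsWith("/mcp/")) return "mcp";
  if (pathname === "/" || pathname === "/index.html") return "widget";
  return "not_found";
}

// export default {
//   async fetch(request: Request, env: Env): Promise<Response> {
//     switch (route(new URL(request.url).pathname)) {
//       case "mcp":
//         return handleMcpRequest(request, env);
//       case "widget":
//         return new Response(await getWidgetHtml(new URL(request.url).host), {
//           headers: { "content-type": "text/html" },
//         });
//       default:
//         return new Response("Not found", { status: 404 });
//     }
//   },
// };
```

Keeping the path logic in a small pure function makes it easy to verify the /mcp endpoint resolves correctly before you wire ChatGPT to it.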

3. Connect the app in ChatGPT: Now the exciting part – making ChatGPT aware of your app. In the ChatGPT UI, go to Settings > Apps & Connectors (with developer mode on). Click Create to add a new custom app. You’ll be prompted to fill in details such as the app’s name, a short description, and the MCP endpoint URL from the previous step.

Once filled, click Create. ChatGPT will now register this connector. (Under the hood, it reaches out to your MCP endpoint to fetch the app’s manifest, available tools, and UI resource info.)

4. Test the app in conversation: Start a new chat (or use an existing one) and invoke your app. You can usually call an app by name or let ChatGPT suggest it. For example, if you built a chess game app, you could simply type “Let’s play chess” in ChatGPT. ChatGPT will recognize the intent and respond by activating the chess app – it will call the playChess tool on your MCP server, which in turn causes your UI widget to render in the chat. You should see your app’s interface (e.g., the chessboard or your to-do list UI) appear inline as part of the conversation, and you can interact with it directly while continuing the chat around it.

If ChatGPT doesn’t immediately use your app when you expect, you can manually force it by invoking the app from the prompt box. Click the “+” icon in the chat input, go to More, and select your app’s name from the list – then issue a command (this ensures the message is directed to your app).

5. Iterate and refine: Building ChatGPT apps is an iterative process. You may need to adjust your prompts, tool definitions, or UI based on how ChatGPT interprets user instructions. If you update your Worker code or UI, you’ll have to redeploy and then refresh the connector in ChatGPT to pick up changes. During development, a quick trick is to use a Cloudflare Tunnel (cloudflared) to expose your local dev server to ChatGPT, so you can test without deploying each time. However, this is optional – deploying to a dev workers.dev subdomain works fine for most cases. Just remember that after changes to the UI, ChatGPT might cache the old widget; you may need to remove and re-add the app, or use the Refresh option in the ChatGPT connector settings to fetch the latest version.

Ideas for ChatGPT Apps to Build on Cloudflare

With the foundation in place, the possibilities for what you can build are vast. Leveraging Cloudflare’s platform, you can go well beyond basic chat interactions: multiplayer games coordinated through Durable Objects, personal dashboards backed by data in D1 or KV, or assistants that pull from external APIs in real time.

In all these cases, Cloudflare’s platform provides the performance and flexibility needed. You can mix and match services – for example, using a Durable Object to buffer real-time events, storing user-specific data in D1 or KV, and caching external API responses to stay within rate limits.

Conclusion

Building a ChatGPT app on Cloudflare unlocks a new world of conversational applications that are interactive, stateful, and highly scalable. Thanks to OpenAI’s Apps SDK and the open MCP standard, you have the freedom to host your app logic on Cloudflare’s edge network, close to users and with enterprise-grade reliability. We walked through how to set up a Cloudflare Worker as an MCP server, serve a custom UI widget, and connect it to ChatGPT, as well as how to use Cloudflare’s unique features like Durable Objects for multi-user state and real-time sync. The result is an app that feels native to ChatGPT, yet can harness any backend logic or external data you need.

As you build your own ChatGPT apps on Cloudflare, keep these best practices in mind: design intuitive tools (so ChatGPT knows when to use them), ensure your UI is user-friendly, and secure any sensitive operations (Cloudflare can help here with built-in security features). With imaginative ideas and Cloudflare’s robust platform, you can create chat-based experiences that go well beyond text — from games and dashboards to personal assistants that truly act on the user’s behalf. This is a rapidly evolving field, so stay tuned to both OpenAI’s and Cloudflare’s documentation for new capabilities. Now it’s your turn to build, deploy, and delight users with your Cloudflare-powered ChatGPT app 🚀.


About the author

Stephane Busso

Software builder and engineering manager based in New Zealand 🇳🇿. HTDOCS.dev is a medium to share about AI, technology, and engineering.