Tools for building with LLMs

A collection of AI tools that can help you build with Webflow APIs more efficiently.

Large Language Models (LLMs) are making it easier to build with Webflow’s APIs. Use these tools to access real payloads and schemas for your sites and collections, generate code, and streamline your Webflow API development workflow.

Documentation for LLMs

Webflow’s documentation is optimized for consumption by AI assistants, making it easier for these tools to generate accurate code examples and guidance.

  • Use https://developers.webflow.com/llms.txt to access the LLM-readable documentation. This optimized structure helps LLMs respond with accurate code snippets and multi-step sequences.
  • Additionally, you can access markdown versions of any documentation page to provide a more structured and context-rich experience for LLMs. To access the markdown version of a page, add .md to the end of the URL. For example, this current doc is available as a markdown file at https://developers.webflow.com/data/docs/ai-tools.md.
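The `.md` convention above is just a suffix append. As a quick sketch (the helper name here is hypothetical, not part of any Webflow SDK):

```python
def to_markdown_url(page_url: str) -> str:
    """Return the LLM-friendly markdown URL for a Webflow docs page.

    Hypothetical helper: it simply appends the .md suffix described above.
    """
    return page_url.rstrip("/") + ".md"

print(to_markdown_url("https://developers.webflow.com/data/docs/ai-tools"))
# https://developers.webflow.com/data/docs/ai-tools.md
```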

Installing docs on Cursor

1. In the chat, click the @ button
2. Find the “Docs” option
3. Click “Add new doc”
4. Paste in the following link: https://developers.webflow.com/llms.txt

Once configured, reference Webflow’s documentation by typing @Docs in your chat window and selecting “Webflow” from the list.


Webflow MCP server

For developers using AI-powered tools like Cursor or Claude Desktop, we provide a Model Context Protocol (MCP) server that enhances the AI’s understanding of your Webflow projects. The MCP server has tools that let the AI agent access real-time information about your sites, collections, and other objects, enabling more accurate and contextual code suggestions and troubleshooting. For a full list of tools, see the MCP server documentation.

Installation for AI clients

  1. Get a Webflow API token from the Webflow API Playground

  2. Add the following snippet to your client’s configuration file:

    {
      "mcpServers": {
        "webflow": {
          "command": "npx",
          "args": ["-y", "webflow-mcp-server@0.2.0"],
          "env": {
            "WEBFLOW_TOKEN": "YOUR_API_TOKEN"
          }
        }
      }
    }

    Remember to replace YOUR_API_TOKEN with your actual Webflow API token.

  3. Add the MCP server to your AI client (steps shown for Cursor):

     1. Go to Settings → Cursor Settings → MCP
     2. Click + Add New Global MCP Server
     3. Paste the configuration into .cursor/mcp.json
     4. Save and verify the server status

  4. Start interacting with the MCP server

     In the “Chat” window, switch to “Agent Mode” and start interacting with the MCP server. You can ask the agent things like:

     • “When was my site last published?”
     • “What were the last 5 CMS items published to this site?”
     • “Based on my last 5 blog posts, can you generate some ideas for new blog posts?”
     • “What are the current SEO issues on my site and how can I fix them?”

Send us feedback!

We’re just getting started with this, so we’d love to hear from you! If you’d like to see more tools or have any feedback, please log an issue on GitHub.

FAQs & Troubleshooting

After installing the MCP server, you may need to restart your AI client to see the new server. Additionally, check to see that your client (e.g. Cursor, Claude Desktop) is updated to the latest version.

You may also need to pin the Webflow MCP server to a specific version using the @ symbol: for example, webflow-mcp-server@0.2.0 rather than webflow-mcp-server.
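If the server still fails to connect, it’s worth confirming that your API token itself is valid. One way to check (a sketch, not an official diagnostic) is to call the Webflow Data API directly; a successful response listing your sites means the token works:

```shell
# Replace YOUR_API_TOKEN with the token you generated earlier.
curl -s \
  -H "Authorization: Bearer YOUR_API_TOKEN" \
  -H "accept: application/json" \
  "https://api.webflow.com/v2/sites"
```

If this returns an authorization error, regenerate the token and update the WEBFLOW_TOKEN value in your MCP configuration.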
