Build and Automate Anything with n8n + MCP: Run AI Workflows Locally

Automate AI workflows with n8n and MCP! Run powerful AI agents locally, connect to services, and leverage large language models. Learn how to set up a flexible, developer-friendly AI workflow in this comprehensive guide.

April 21, 2025


Unlock the power of local AI with n8n and MCP! Discover how to build and automate anything, from LLMs to AI agents, all within a secure and efficient Docker environment. Explore a flexible, developer-friendly approach to creating smarter AI workflows that seamlessly integrate external data sources and services.

Why Docker is a Game-Changer for Automation

Think of Docker as a virtual container system. It lets you run applications in isolated, self-contained environments called containers. This ensures everything runs smoothly without dependency issues or system conflicts, which is especially powerful when working with n8n and MCP.

With Docker, you can spin up n8n in seconds and integrate MCP servers quite quickly. This makes the setup process easy, clean, portable, and secure. Docker Desktop integrates with your favorite development tools, simplifying deployment and dramatically accelerating your workflow.

By using Docker, you can easily configure tools like MCPs without installing complex dependencies directly onto your machine. Docker containers provide a consistent, isolated environment, allowing you to focus on building your AI workflows without worrying about system-level conflicts or compatibility issues.

Setting Up n8n with Docker

To set up n8n with Docker, follow these steps:

  1. Download and install Docker Desktop for your operating system.

  2. Pull the official n8n Docker image:

    • Open Docker Desktop and search for "n8n" (the search bar also covers Docker Hub images).
    • Find the official n8n image and click "Pull".
  3. Configure the n8n container:

    • Go to the "Images" section and click the "Run" button.
    • Click on "Optional Settings" and give the container a name.
    • Map port "5678" to the container.
    • Set a host path for the volume, e.g., create an "n8n" folder in your Documents directory and use that path.
    • Set the container path to "/home/node/.n8n", where n8n stores its data.
    • Set the environment variable "N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE" to "true" so community nodes (like the MCP client) can be used as agent tools.
  4. Start the n8n container by clicking "Run".

  5. Access the n8n web interface by clicking on the running container and opening the local host URL.

  6. Create an account and start building your workflows.

  7. Configure the MCP (Model Context Protocol) integration:

    • Go to the n8n settings, navigate to "Community Nodes", and install the "n8n-nodes-mcp" node.
    • Create a new workflow and add a "Chat Trigger" node.
    • Connect the trigger to a "Chat Model" node and configure the OpenAI API key.
    • Add an "MCP Client" node and configure the credentials for the desired MCP server (e.g., Brave Search).
    • Add another "MCP Client" node to execute the MCP tool (e.g., web search).
    • Test the workflow by sending a chat message and observing the MCP-powered response.

With Docker, you can easily set up a consistent and isolated environment to run n8n and integrate MCP-powered AI agents, enabling a powerful local AI workflow.
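If you prefer the command line over Docker Desktop's UI, the GUI steps above can be condensed into a single `docker run` invocation. This is a sketch, assuming the `n8nio/n8n` image from Docker Hub, a host folder at `~/Documents/n8n`, and the community-packages variable described earlier; adjust the path and variable to your setup:

```bash
# Pull the official n8n image
docker pull n8nio/n8n

# Run n8n: expose port 5678, persist data to a host folder,
# and allow community nodes (such as the MCP client) to act as agent tools
docker run -d --name n8n \
  -p 5678:5678 \
  -v ~/Documents/n8n:/home/node/.n8n \
  -e N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true \
  n8nio/n8n

# n8n should now be reachable at http://localhost:5678
```

The volume mount is what makes the setup portable: your workflows and credentials survive container restarts and image upgrades.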

Configuring and Setting Up MCPs

To configure and set up MCPs (Model Context Protocol) within the n8n automation platform, follow these steps:

  1. After launching the n8n container, click on the three dots in the top-right corner and go to the "Settings" section.
  2. In the settings, navigate to the "Community Nodes" tab and click on the "Install" button for the "n8n-nodes-mcp" node.
  3. Click on "I understand the risks" and then "Install" to add the MCP-related node to your n8n instance.
  4. Once the installation is complete, head back to the main workflow canvas.
  5. Give your workflow a name, and then add a "Chat Trigger" node to start the workflow.
  6. Connect the Chat Trigger to an AI Agent node and attach a chat model, such as an OpenAI model, providing the necessary API credentials.
  7. Add a "Memory" node to your AI agent to enhance its capabilities.
  8. Click on the "+" sign and search for the "MCP Client" node. This is the node you just installed from the Community Nodes.
  9. Select the "MCP Client" node and configure the credentials by creating a new credential.
  10. In the credential settings, define the command that starts the MCP server, typically npx -y <package>, where <package> is the npm package of the specific MCP server you want to integrate (e.g., Brave Search, GitHub, etc.).
  11. Set the necessary environment variables for the MCP server, such as the Brave Search API key.
  12. Save the credential and close the settings.
  13. In the operations for the MCP Client node, select "List Tools" to test the integration and see the available tools.
  14. Add another MCP Client node to execute a specific tool, such as the "web search" tool.
  15. Configure the tool parameters as needed, and then connect the MCP Client node to your AI agent.
  16. You can now test the workflow by interacting with the AI agent, which will utilize the integrated MCP tools to perform various tasks.

By following these steps, you have successfully configured and set up MCPs within your n8n automation workflow, allowing your AI agents to interact with external data sources, services, and local resources.
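As a concrete illustration of step 10, the MCP Client credential for the Brave Search server would typically look like the fields below. The package name `@modelcontextprotocol/server-brave-search` and the `BRAVE_API_KEY` variable are the conventional ones from the reference MCP servers; check the README of the server you are installing:

```bash
# MCP Client credential fields (STDIO transport) for Brave Search:
# Command:
npx
# Arguments:
-y @modelcontextprotocol/server-brave-search
# Environment (set in the credential, not in your shell profile):
BRAVE_API_KEY=your-brave-api-key
```

The `-y` flag tells npx to install the package without prompting, so n8n can launch the server non-interactively.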

Building an AI Agent Workflow with MCPs

Docker is a game-changer for automation, allowing you to run applications in isolated, self-contained environments called containers. This ensures everything runs smoothly without dependency issues or system conflicts, making it especially powerful when working with tools like n8n and MCPs (Model Context Protocol).

First, you'll need to download and install Docker Desktop for your operating system. Once installed, you can pull and run the official n8n Docker image. Be sure to configure the necessary environment variables, such as "N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE", which lets community nodes like the MCP client act as agent tools and is disabled by default for security reasons.

With n8n running in your Docker container, you can now set up the MCP integration. Install the "n8n-nodes-mcp" community node, which allows you to interact with MCP servers directly within n8n. This provides your AI agents with additional functionality and the ability to utilize various tools, enhancing their flexibility in completing tasks.

To create an AI agent workflow, start by adding a chat trigger, which will execute whenever a chat message is received. Connect this to an AI model, such as the OpenAI model, by providing the necessary API credentials. Then, add an MCP client node to your workflow, allowing your agent to list available tools and select the appropriate credentials.

For example, you can integrate the Brave Search MCP by copying the npx configuration from the provided GitHub repository and setting the necessary environment variables. This enables your agent to execute the "web search" tool provided by the Brave Search MCP.
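The configuration you copy from such a repository usually has the shape below (the exact package name and key are assumptions; use whatever the repository's README shows). In n8n, you translate it into the MCP Client credential: the "command" becomes the command field, "args" the arguments, and "env" the environment variables:

```json
{
  "mcpServers": {
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        "BRAVE_API_KEY": "YOUR_BRAVE_API_KEY"
      }
    }
  }
}
```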

By combining Docker, n8n, and MCPs, you can build a powerful, open-source, and completely free local AI workflow. This setup allows your AI agents to interact with real-world tools, APIs, browsers, and local services, expanding their capabilities and enabling them to assist you in various ways.

Conclusion

With Docker, we get a clean and consistent environment to run complex tools without worrying about messy dependencies. n8n adds powerful node-based automation and orchestration, making it easier to connect everything visually. And then we have MCPs, which bring true AI agent capabilities, allowing large language models to interact with real-world tools like APIs, browsers, and local services.

Together, we saw how amazing this local AI workflow can be - it's open-source, completely free, and can assist you in multiple ways. I hope you enjoyed today's video and got some value out of it.

Make sure to check out all the links in the description below, subscribe to the second channel, follow me on the newsletter, join our Discord, follow me on Twitter, and lastly, subscribe to the YouTube channel, like this video, and take a look at our previous videos - there's a lot of content that you will truly benefit from.

With that said, have an amazing day, spread positivity, and I'll see you guys really shortly. Peace!

FAQ