How To Sandbox or Containerize Your MCP Servers

Sandboxing MCP (Model Context Protocol) servers is an essential step when running MCP servers on a workstation (also known as a local MCP deployment). 

Workstation MCP server deployments are not going away. Despite the increasing availability of remote MCP servers and other MCP deployment options, people and organizations will still want or need to deploy MCP servers on workstations or in their own managed infrastructure. 

However, individuals and organizations alike are often unaware of, or overly relaxed about, the serious risks of Workstation MCP deployments, which include access to all local files, exfiltration of sensitive data and credentials, and remote command execution.

This blog explains why sandboxing or containerization is essential for Workstation MCP deployments, and how to containerize the MCP servers you use with relative ease.

I’ll also give you a few quick examples to show how MCP Manager offers additional MCP server deployment options that maximize scalability, ease of use, and security, ideal for organizations that want to adopt MCP servers quickly and successfully. 

What Does Sandboxing or Containerizing MCP Servers Mean?

Sandboxing or containerizing MCP servers means running them in an isolated container environment. Without containerization, malicious actors can use a corrupted MCP server to steal sensitive information or implant viruses within your workstation, and potentially your corporate network.

Containerizing an MCP server adds a restrictive layer around it. The MCP server can only access or come into contact with what the container can access, and you control what the container can access. 

Containerizing your MCP server restricts its access to files you explicitly grant it and protects your wider workstation from any dangerous code it executes.
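For example, a container started with a single read-only bind mount can see that one directory and nothing else on the host. The image name and paths below are placeholders (building the image itself is covered later in this guide):

# Hypothetical image name and paths: only ~/mcp-data is visible inside the container, and only read-only
docker run --rm \
  --mount type=bind,source="$HOME/mcp-data",target=/data,readonly \
  my-mcp-server-image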

People typically containerize MCP servers using Docker containers. Docker is free, open source, and widely supported across all major operating systems. 

Once you have containerized your MCP server, you can expose it using an NGINX proxy with Supergateway or a secure tunnel to allow remote access to your server. Read how to expose your containerized MCP server.
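As a rough sketch of the exposure step, you can publish the container's port on the loopback interface only and let the proxy or tunnel provide the authenticated public endpoint. The port and image name here are assumptions, not fixed values:

# Publish the server only on localhost; NGINX/Supergateway or an ngrok/Pinggy tunnel
# then sits in front of it and handles authentication for remote clients
docker run --rm -p 127.0.0.1:8000:8000 my-mcp-server-image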

What Are The Risks of Not Sandboxing/Containerizing Workstation MCP Servers?

When you run an MCP server on your workstation without sandboxing/containerization, it has access to all your local files, which could include:

  • Sensitive data
  • Credentials
  • API keys
  • System files

Some people running MCP servers on workstations also store bearer tokens in plaintext in well-known locations in their file system, making them easy to find and steal.

Attackers can use a malicious or compromised MCP server to access, exfiltrate, or modify any or all of these files. Attackers can also implant and propagate viruses or execute ransomware attacks. 

There’s also the possibility that an innocent MCP server and AI agent could inadvertently exfiltrate, modify, or delete these files. Unfortunately, we can’t rely on AI for common sense or consistently sensible judgment calls, at least not yet!

Why You Should Run MCP Servers In Secure Containers

Running MCP servers inside secure Docker containers has a range of benefits, chief among them:

  1. Prevents Data Exfiltration: A containerized server can access only the host files the user decides to expose to it.
  2. Improved Secret Management: Replaces the insecure secret storage practices associated with workstation MCPs.
  3. Resource Control: You can set limits on the container's CPU and memory usage, as shown in the sketch below.
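As an illustration of the third point, here is a hedged sketch of the kinds of limits and lockdowns Docker supports; the numbers and image name are arbitrary examples, not recommendations:

# Cap CPU and memory, drop Linux capabilities, block privilege escalation,
# and make the container's root filesystem read-only
docker run --rm \
  --cpus="1.0" \
  --memory="512m" \
  --cap-drop=ALL \
  --security-opt no-new-privileges \
  --read-only \
  my-mcp-server-image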

Does Containerizing an MCP Server Provide Complete Security?

No, containerization is not a bulletproof solution. Here’s why: 

If you're using a containerized MCP server on a workstation that's connected to a corporate network, there is still a risk that an attacker could use the MCP server to reach and perform malicious actions across that network. For maximum security, consider containerizing your MCP servers and running them outside of your corporate network.

Here’s a summary of how secure different approaches to running MCP servers on your workstation/locally are:

  • Highest Security: Running an MCP server outside a corporate network with sandboxing.
  • Medium Security: Running an MCP server locally with sandboxing.
  • Lowest Security: Running an MCP server locally without sandboxing.

How To Sandbox an MCP Server

Here’s a simplified, step-by-step guide to using a Docker container to sandbox MCP servers:

Pre-Requisites:

First, you need to prepare your workstation to build and run Docker images. 

  1. Install Docker Desktop on macOS or Windows, or Docker Engine if you’re using Linux.
  2. Install Node to work with scripts and load environment variables.
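Once both are installed, a quick sanity check confirms the tools are on your PATH (version numbers will vary):

# Verify the Docker and Node toolchains are available
docker --version
node --version
npm --version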

Prepare the Environment Variables & Dockerfile

Step 1: Copy the ".env.example" file at the root of the repository and rename the copy to ".env". You will add your configuration and secrets here.

Step 2: Open the ".env" file and assign values to the following environment variables (use "npm run gen_key" to generate secure secrets):

# (Required) The MCP server to run inside the container
NPM_MCP="@modelcontextprotocol/server-filesystem"

# (Optional) Pass '--stateful' if your MCP server is stateful, otherwise leave empty
SUPERGATEWAY_EXTRA_ARGS="--stateful"

# (Required if using the NGINX proxy) Set this if you're using the NGINX proxy to secure your connection
ACCESS_TOKEN="secret_key__please_change"

# (Required if using an ngrok secure tunnel) Set these if you're using an ngrok tunnel to secure your connection
NGROK_URL="example-url.ngrok.app"
NGROK_AUTHTOKEN="secret_key__please_change"
NGROK_BASIC_AUTH="user:password__please_change"

# (Required if using a Pinggy secure tunnel) Set these if you're using a Pinggy tunnel to secure your connection
PINGGY_ACCESS_TOKEN="secret_key__please_change"
PINGGY_BEARER_TOKEN="secret_key__please_change"
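If you prefer to generate secrets without the repository's "npm run gen_key" helper, any cryptographically random string will do; for example (an alternative, not necessarily what that script itself does):

# Generate a 32-byte random hex string to paste into ACCESS_TOKEN and similar variables
openssl rand -hex 32

# Or using Node directly
node -e "console.log(require('crypto').randomBytes(32).toString('hex'))"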

Step 3: Depending on which approach you want to take, select one of the 3 Dockerfiles that we provide here and place it at the root of this repository (replacing the existing Dockerfile).
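To give a sense of what such a Dockerfile can look like, here is a minimal sketch of a Supergateway-style variant. Treat it as an illustration only: the base image, port, and Supergateway invocation (--stdio, --port) are my assumptions, and the Dockerfiles provided in the repository may be structured differently.

# Illustrative sketch only; in practice, use one of the Dockerfiles provided with the repository
FROM node:22-alpine

# The MCP package is supplied at build time via --build-arg; no secrets are read here
ARG NPM_MCP
RUN npm install -g supergateway ${NPM_MCP}

# Keep the package name available at runtime for the start command
ENV NPM_MCP=${NPM_MCP}

EXPOSE 8000

# Secrets and extra flags arrive as environment variables at runtime (docker run --env-file / -e);
# some MCP servers also take extra arguments, e.g. an allowed directory
CMD npx -y supergateway --stdio "npx -y ${NPM_MCP}" --port 8000 ${SUPERGATEWAY_EXTRA_ARGS}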

Step 4: Build the Docker image using the "npm run build" script.

Step 5: Start the Docker container using the "npm run start" script.

You can find the source code for the build and start scripts in our MCP Checklists GitHub repository, within the infrastructure/package_scripts directory. These simple scripts load the environment variables from the .env file and forward them to the respective Docker commands ("docker build" and "docker run").
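Under the hood, the scripts resolve to ordinary Docker commands, roughly like the following; the image name, port, and exact flags are illustrative, and the scripts in the repository are the source of truth:

# Approximate equivalent of "npm run build": the MCP package name is passed as a build argument
docker build --build-arg NPM_MCP="@modelcontextprotocol/server-filesystem" -t mcp-sandbox .

# Approximate equivalent of "npm run start": secrets are injected at runtime from .env
docker run --rm --env-file .env -p 127.0.0.1:8000:8000 mcp-sandbox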

Technical Notes For Building Docker Images:

  • If possible, base your Docker images on Alpine, which offers the smallest image size.
  • If you run into compatibility issues, base your Docker images on Slim variants instead (e.g. "node:$NODE_VERSION-slim"). These are usually Debian-based and offer more utilities and wider compatibility.
  • Never access secrets during the build stage of the Dockerfile. Any variables you access while building the Docker image become embedded in it. All secrets should be passed to the MCP server at runtime, which the "npm run start" script does automatically, as sketched below.
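To make the last point concrete, here is the distinction, sketched with ACCESS_TOKEN as the example secret and a hypothetical image name:

# BAD: a build-time argument bakes the secret into the image layers and its history
docker build --build-arg ACCESS_TOKEN="secret_key" -t mcp-sandbox .

# GOOD: runtime environment variables exist only inside the running container
docker run --rm --env-file .env mcp-sandbox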

For a more detailed walkthrough, complete with Dockerfiles, see the guide to securing local MCP servers.

Robust & Scalable MCP Containerization For Businesses

While the process above gives technically savvy or adventurous individuals an easy way to sandbox MCP servers in a secure container, this manual, machine-by-machine approach is unappealing to non-technical users and difficult to scale or manage across an organization.

At MCP Manager, we’ve discovered that simple containerization isn’t sufficient for most businesses, as it lacks the scalability, consistency, and control they need from their MCP deployments. 

In response, we have developed new approaches to MCP deployment that meet the needs of enterprises adopting MCP servers. These new deployment approaches allow organizations to manage their own deployment internally, and create a mix of dedicated and shared “managed” deployments.

Managed deployments enable organizations to provision MCP servers securely at scale while retaining control and consistency, and to provide secure access to local files and organizational resources as required.

To learn more about how MCP Manager makes MCP deployments easy, secure, scalable, and successful, read about our deployment services and solutions, or get moving faster by scheduling your 1-1 demo now:

Ready to give MCP Manager a try?


MCP Manager secures AI agent activity.