Expose tunnel
Remote MCP clients (ChatGPT, claude.ai, mobile apps, cloud agents) cannot launch a local stdio process on your machine. To reach a local Neotoma instance they need an HTTPS URL. An HTTPS tunnel forwards MCP traffic from the public internet to your local API server while your data stays on your machine.
Neotoma ships with built-in tunnel support via neotoma api start --tunnel. You can also use any standard reverse-tunnel tool pointed at the local API port.
Built-in tunnel
```
neotoma api start --env prod --tunnel
```
Starts the local API server and opens a tunnel. The public URL is printed to stdout once the tunnel is established. Pass --tunnel-provider <name> to select a specific provider (default is auto-detected).
Auth is required for writes; unauthenticated callers can only read public discovery endpoints (/server-info, /.well-known/*).
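As a quick sanity check (the hostname below is a placeholder; substitute the public URL printed when the tunnel starts), you can confirm the tunnel is forwarding by reading a public discovery endpoint, which requires no auth:

```shell
# Placeholder hostname; use the URL printed by `neotoma api start --tunnel`.
curl -s https://your-tunnel-host.example.com/server-info
```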
Provider setup
Neotoma auto-detects the tunnel provider. Set --tunnel-provider ngrok or --tunnel-provider cloudflare to override. The local port is 3080 (prod) or 3180 (dev) by default.
ngrok
Requires a free ngrok account. After installing, authenticate once:
```
ngrok config add-authtoken YOUR_AUTHTOKEN
```
For a stable URL across restarts, set HOST_URL in .env to your ngrok reserved/custom domain (e.g. https://your-subdomain.ngrok-free.app). Otherwise a random URL is generated each run.
Manual alternative: ngrok http 3080
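For example, a .env entry pointing at a reserved ngrok domain might look like this (the subdomain is a placeholder):

```
HOST_URL=https://your-subdomain.ngrok-free.app
```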
Cloudflare Tunnel
Install cloudflared (brew install cloudflare/cloudflare/cloudflared).
Quick tunnel (ephemeral URL, no config): cloudflared tunnel --url http://localhost:3080
Named tunnel (stable URL): set HOST_URL in .env to your public hostname. Ensure ~/.cloudflared/config.yml declares the tunnel and routes ingress to http://localhost:3080.
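A minimal ~/.cloudflared/config.yml for a named tunnel might look like the following sketch (tunnel ID, credentials path, and hostname are placeholders; substitute your own):

```yaml
tunnel: YOUR_TUNNEL_ID
credentials-file: /Users/you/.cloudflared/YOUR_TUNNEL_ID.json
ingress:
  # Route the public hostname (the one set as HOST_URL in .env) to the local API port.
  - hostname: neotoma.example.com
    service: http://localhost:3080
  # cloudflared requires a catch-all rule as the last ingress entry.
  - service: http_status:404
```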
Tailscale Funnel
Requires Tailscale with Funnel enabled on your tailnet. No Neotoma-side config needed.
```
tailscale funnel 3080
```
Once the tunnel is up, use the public URL as your MCP server URL in ChatGPT, claude.ai, or any remote client. The tunnel forwards all traffic: MCP at /mcp, Inspector at /app, and the REST API at its standard endpoints.
When to use a tunnel
- ChatGPT and claude.ai: web-based clients that connect to MCP servers over HTTPS.
- Mobile and tablet: agents running on devices that cannot reach localhost on your development machine.
- Codex and OpenClaw: cloud agents and hosted services that need to write observations or read state from your Neotoma instance.
- Multi-machine: when you run agents on multiple machines and want a single source of truth. See hosted flavors for persistent deployment options.
See connect for the full list of client-specific setup guides, and AAuth for agent identity verification over tunneled connections.
Remote auth
Writes through the tunnel require authentication. Configure at least one of these in .env before starting the tunnel:
- NEOTOMA_BEARER_TOKEN: quick start; remote clients pass this as a Bearer token.
- NEOTOMA_KEY_FILE_PATH or NEOTOMA_MNEMONIC: key-authenticated MCP OAuth (agents sign requests with AAuth).
Without any of these, the tunnel setup warns you and remote writes are rejected. Unauthenticated callers can still read discovery endpoints (/server-info, /.well-known/*).
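As a sketch of the bearer-token path (the hostname is a placeholder, and the request body is a generic MCP JSON-RPC call, not a Neotoma-specific payload), a remote client authenticates by sending the standard Authorization header:

```shell
# Placeholder host; the token must match NEOTOMA_BEARER_TOKEN in .env.
curl -s -X POST https://your-tunnel-host.example.com/mcp \
  -H "Authorization: Bearer $NEOTOMA_BEARER_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
```

Without the Authorization header, the same request would be rejected, while GET /server-info would still succeed.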
For additional control, use tunnel-provider features (IP allowlists, password protection) or restrict the tunnel to your Tailscale network.
See hosted flavors for an overview of deployment options, connect for client-specific setup, and AAuth for agent identity over tunneled connections.