Self-Hosting OpenMemory MCP on Proxmox with Tailscale

## The Goal

I wanted a persistent AI memory layer — something that stores context across conversations and tools, accessible from Claude Desktop, Claude Code, and eventually other MCP clients. The official mem0 platform exists, but I wanted to self-host it on my Proxmox cluster for control and privacy.

## The Stack

The deployment runs on mem01 (LXC 3003, pve02) with three Docker containers:

- Ollama — LLM inference for embeddings (bge-m3) and chat (qwen3:8b)
- OpenMemory — the MCP server itself, using SQLite for vectors and metadata
- Open WebUI — optional web interface for testing

I started with mem0-mcp-selfhosted (Neo4j + Qdrant + Python SDK), but it crashed repeatedly and had a painful dependency chain. OpenMemory — SQLite for everything, Node.js SDK — just worked. ...
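A three-container stack like this maps naturally onto a single docker-compose file. The sketch below is a minimal illustration, not the author's actual config: the OpenMemory image name, the `OLLAMA_BASE_URL` variable, and all port mappings are assumptions.

```yaml
# Hypothetical docker-compose.yml — image tags, env var names, and ports are assumptions.
services:
  ollama:
    image: ollama/ollama:latest        # serves bge-m3 (embeddings) and qwen3:8b (chat)
    volumes:
      - ollama:/root/.ollama           # persist pulled models across restarts
    ports:
      - "11434:11434"

  openmemory:
    image: mem0/openmemory-mcp:latest  # assumed image name; MCP server, SQLite-backed
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434  # assumed variable name
    volumes:
      - openmemory:/app/data           # SQLite vectors + metadata live here
    ports:
      - "8765:8765"
    depends_on:
      - ollama

  open-webui:
    image: ghcr.io/open-webui/open-webui:main  # optional web UI for testing
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "3000:8080"
    depends_on:
      - ollama

volumes:
  ollama:
  openmemory:
```

Keeping everything on named volumes means the LXC container can be rebuilt without losing models or memories.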

March 31, 2026 · 4 min · Adam Behn