The memory cloud for AI agents. Claude Code, Codex, Gemini, Cursor: they all pick up where you left off. Connect via MCP, CLI, or REST API. For developers and non-coders alike.
$ claude mcp add --transport http kagura-memory https://memory.kagura-ai.com/mcp

Codex, Cursor, and Gemini use the same MCP config. CLI and REST API also available. Self-hosted? Just swap the URL.
Re-explaining the same context to Claude Code — again.
Searching chat history for an answer you know you got — somewhere.
Knowledge trapped in one person's head, invisible to the team.
Don't let your AI conversations — or your agent's work — go to waste.
Conversations with AI, work done by agents — all become memory. The more you use it, the more it organizes and refines itself into a living knowledge base.
Conversations automatically become searchable knowledge. No manual saving.
Merges duplicates, strengthens connections, resolves contradictions — your knowledge base refines itself as you use it.
Memories shared across Claude, Cursor, Gemini CLI, and any MCP-capable client, plus REST API.
Personal insights become team assets. Shared contexts, role-based access.
And memory that compounds becomes your knowledge asset.
Without memory, AI starts from scratch every time.
With memory, AI grows into your own knowledge base.
Four steps from raw input to a compounding knowledge base.
Conversations, files, and external sources flow in via MCP, REST API, or auto-sync.
Your AI structures each fact into a 3-layer schema — summary, context, content. Sleep Maintenance consolidates nightly.
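The 3-layer schema can be pictured as a small record type. This is a minimal sketch under assumptions: the field names and the example values are illustrative, not Kagura's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class Memory:
    """Hypothetical shape of one stored fact, split into three layers."""
    summary: str                 # one-line gist, cheap to scan in search results
    context: str                 # where, when, and why the fact arose
    content: str                 # full detail, retrieved only on demand
    tags: list[str] = field(default_factory=list)

# Example fact captured from a conversation (values are invented):
m = Memory(
    summary="Team prefers Postgres over MySQL",
    context="Decided during the infra review discussion",
    content="Reasons: JSONB support, mature replication, existing team expertise.",
)
print(m.summary)
```

Splitting summary from content lets search rank against the cheap layer first and fetch the expensive layer only for hits, which is a common pattern in memory stores.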
Triple index — keyword (BM25), semantic (vector), and relational (graph). Hybrid Search with AI Reranker.
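One common way to fuse a keyword, a semantic, and a graph ranking into a single result list is reciprocal rank fusion (RRF). The sketch below shows that fusion step only; whether Kagura actually uses RRF, and how its AI Reranker reorders the fused list afterward, are assumptions, and the document IDs are invented.

```python
from collections import defaultdict

def rrf(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Fuse several ranked doc-id lists into one ordering.

    Each list contributes 1 / (k + rank + 1) per document; documents
    ranked highly by multiple indexes accumulate the largest scores.
    """
    scores: dict[str, float] = defaultdict(float)
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking):
            scores[doc_id] += 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

keyword = ["doc3", "doc1", "doc2"]   # BM25 order
semantic = ["doc1", "doc3", "doc4"]  # vector-similarity order
graph = ["doc1", "doc2", "doc3"]     # relational order
fused = rrf([keyword, semantic, graph])
print(fused[0])  # doc1: top-ranked by two of the three indexes
```

RRF needs only ranks, not comparable scores, which is why it is a popular way to combine indexes as different as BM25, vectors, and a graph.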
Hebbian learning strengthens connections every time you recall — your knowledge base gets smarter without LLM cost.
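A Hebbian-style update is plain arithmetic, which is why it costs nothing in LLM calls. This is a hedged sketch of the idea, not Kagura's implementation: the update rule, learning rate, and edge keys are all illustrative assumptions.

```python
def strengthen(weight: float, rate: float = 0.1) -> float:
    """Nudge a connection weight toward 1.0 on each co-recall.

    The increment shrinks as the weight grows, so repeated recall
    saturates rather than growing without bound.
    """
    return weight + rate * (1.0 - weight)

# Hypothetical edge between two memories, strengthened by three recalls:
w = 0.2
for _ in range(3):
    w = strengthen(w)
print(round(w, 4))  # 0.4168
```

Because unrecalled edges are simply never updated, frequently used paths stand out over time without any model inference.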
…and every recall compounds — connections strengthen with use, and the cycle continues.
$ git clone kagura-ai/memory-cloud
$ docker compose up -d

Connect and start.