Kagura Memory Cloud
MCP-native LLM Knowledge Base with Hebbian learning

Shared memory & knowledge base for AI.

Claude, Cursor, Gemini CLI — every MCP-capable client shares the same memory. Kagura Memory Cloud turns conversations into lasting knowledge.

One line to connect (Claude Code)
$ claude mcp add --transport http kagura-memory https://memory.kagura-ai.com/mcp

Cursor and Gemini CLI use the same MCP config. Self-hosted? Just swap the URL.
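That shared config is an mcpServers entry in the client's settings file. A minimal sketch (the endpoint key name varies by client, for example "url" vs. "httpUrl", so check your client's MCP docs; self-hosters point it at their own instance):

  {
    "mcpServers": {
      "kagura-memory": {
        "url": "https://memory.kagura-ai.com/mcp"
      }
    }
  }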

Self-hosted
Open-source
Secure & private

Sound familiar?

Explaining the same context to Claude — again.

Searching chat history for an answer you know you got — somewhere.

Knowledge trapped in one person's head, invisible to the team.

You don't want your AI conversations to go to waste.

Memory Cloud's answer

Turn AI conversations into memory. What you've said once, every AI remembers.

Auto-memory

Conversations automatically become searchable knowledge. No manual saving.

Beyond RAG

Inspired by Karpathy's LLM Wiki — your AI compiles, indexes, and refines knowledge as you use it. Not just retrieval — a living knowledge base.

Cross-platform

Memories shared across Claude, Cursor, Gemini CLI, and any MCP-capable client, plus REST API.

Team sharing

Personal insights become team assets. Shared contexts, role-based access.

Memory turns AI into an agent.

And memory that compounds becomes a knowledge base.

An AI without memory is a disposable tool.

An AI with memory is a growing partner.

Living Knowledge Base

How knowledge grows

Four steps from raw input to a compounding knowledge base. Inspired by Karpathy's LLM Wiki pattern.

  1. Capture

    Conversations, files, and external sources flow in via MCP, REST API, or auto-sync.

  2. Compile

    Your AI structures each fact into a 3-layer schema — summary, context, content. Sleep Maintenance consolidates nightly.

  3. Index & Query

    Triple index — keyword (BM25), semantic (vector), and relational (graph). Hybrid Search with AI Reranker.

  4. Compound

    Hebbian learning strengthens connections every time you recall — your knowledge base gets smarter without LLM cost.

…and every recall compounds — connections strengthen with use, the cycle continues. The sketches below show one way the three-layer schema, hybrid scoring, and Hebbian strengthening could look in code.
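As a rough illustration of the Compile and Index & Query steps, here is a minimal sketch of a three-layer record and a toy hybrid score that fuses keyword and vector signals before reranking. The field names, fusion weights, and scoring functions are assumptions for illustration, not Kagura's actual schema or ranking.

  # Illustrative only: a toy 3-layer record plus hybrid (keyword + vector) scoring.
  # Field names, weights, and score functions are assumptions, not the product's schema.
  import math
  from dataclasses import dataclass

  @dataclass
  class MemoryRecord:
      summary: str            # layer 1: one-line gist
      context: str            # layer 2: where and why it came up
      content: str            # layer 3: full detail
      embedding: list[float]  # vector used for semantic search

  def keyword_score(query: str, record: MemoryRecord) -> float:
      """Stand-in for BM25: fraction of query terms found in the record."""
      terms = query.lower().split()
      text = f"{record.summary} {record.context} {record.content}".lower()
      return sum(t in text for t in terms) / max(len(terms), 1)

  def vector_score(query_vec: list[float], record: MemoryRecord) -> float:
      """Cosine similarity between the query vector and the stored embedding."""
      dot = sum(a * b for a, b in zip(query_vec, record.embedding))
      norm = math.sqrt(sum(a * a for a in query_vec)) * math.sqrt(sum(b * b for b in record.embedding))
      return dot / norm if norm else 0.0

  def hybrid_search(query, query_vec, records, k=5):
      """Fuse both signals and keep the top-k; an AI reranker would then reorder these."""
      ranked = sorted(
          records,
          key=lambda r: 0.4 * keyword_score(query, r) + 0.6 * vector_score(query_vec, r),
          reverse=True,
      )
      return ranked[:k]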
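The Compound step can be sketched in the same spirit: memories recalled together get a stronger edge between them, and a nightly decay pass fades unused links, so the graph sharpens with use and no LLM call is needed. The graph structure, learning rate, and decay values below are illustrative assumptions.

  # Illustrative Hebbian-style strengthening over a memory graph.
  # Edge storage, learning rate, and decay schedule are assumptions, not Kagura's implementation.
  from collections import defaultdict
  from itertools import combinations

  class MemoryGraph:
      def __init__(self, learning_rate: float = 0.1, decay: float = 0.99):
          self.weights: dict[tuple[str, str], float] = defaultdict(float)
          self.learning_rate = learning_rate
          self.decay = decay

      def _edge(self, a: str, b: str) -> tuple[str, str]:
          return (a, b) if a < b else (b, a)

      def on_recall(self, recalled_ids: list[str]) -> None:
          """Hebbian update: memories recalled together wire together."""
          for a, b in combinations(sorted(set(recalled_ids)), 2):
              self.weights[self._edge(a, b)] += self.learning_rate

      def nightly_decay(self) -> None:
          """Sleep-maintenance-style pass: unused connections slowly fade away."""
          for edge in list(self.weights):
              self.weights[edge] *= self.decay
              if self.weights[edge] < 1e-3:
                  del self.weights[edge]

      def related(self, memory_id: str, k: int = 5) -> list[str]:
          """Strongest associations for one memory, usable to boost related results."""
          linked = [(b if a == memory_id else a, w)
                    for (a, b), w in self.weights.items()
                    if memory_id in (a, b)]
          return [m for m, w in sorted(linked, key=lambda x: -x[1])[:k]]

Called after each query with the IDs of the memories just retrieved, an update like this lets the knowledge base compound purely from usage, which is the point of the step above.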

Built for how you work

Developers

Before: Re-explaining your codebase to Claude every session
After: Claude remembers your architecture, patterns, and past debugging sessions

Researchers

Before: Paper summaries and notes scattered across tools
After: Every insight builds on the last — connections emerge automatically

Teams

Before: Knowledge trapped in individual chat histories
After: Shared context means every team member's AI is up to speed

Up and running in minutes

Cloud

Invite-only
  1. Request beta access (email or sponsor)
  2. Once approved, sign in at memory.kagura-ai.com
  3. Connect your AI client via MCP

Self-hosted (Docker)

  1. git clone kagura-ai/memory-cloud
  2. docker compose up -d
  3. Connect and start
View setup guide

Your next conversation starts a memory.