ClawKit Reliability Toolkit
Official · Verified · Developer Tools · Safety: 5/5

chromadb-memory

Long-term memory via ChromaDB with local Ollama embeddings. Auto-recall injects relevant context every turn. No cloud APIs required — fully self-hosted.

Why use this skill?

Enhance your OpenClaw agent with ChromaDB Memory for self-hosted, long-term semantic recall using local Ollama embeddings. Ideal for agents that need persistent context across sessions.


Install via CLI (Recommended)

clawhub install openclaw/skills/skills/msensintaffar/chromadb-memory

What This Skill Does

The ChromaDB Memory skill provides your OpenClaw agent with long-term semantic memory, using ChromaDB for vector storage and local Ollama embeddings for natural language understanding. Your agent can "remember" past conversations and information without sending data to cloud APIs, ensuring complete self-hosting and privacy.

The core feature is "auto-recall," which automatically searches your ChromaDB collection for memories relevant to the current user message and injects them into the agent's context before each turn. This lets the agent maintain context over extended interactions and access stored knowledge dynamically. For more granular control, a chromadb_search tool is available for manual semantic searches over your collection.
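
The auto-recall flow can be sketched in TypeScript roughly as follows. This is a minimal illustration, not the plugin's actual source: the function and interface names are hypothetical, and it assumes Ollama's `/api/embeddings` endpoint and ChromaDB's v1 REST query API, which returns distances (one common convention converts a cosine distance to a similarity score as `1 - distance`).

```typescript
interface RecallConfig {
  chromaUrl: string;
  collectionName: string;
  ollamaUrl: string;
  embeddingModel: string;
  autoRecallResults: number;
  minScore: number;
}

// Convert a cosine distance into a similarity score (assumed convention).
function distanceToScore(distance: number): number {
  return 1 - distance;
}

// Keep only memories whose similarity score clears the minScore threshold.
function filterByScore(
  docs: string[],
  distances: number[],
  minScore: number
): string[] {
  return docs.filter((_, i) => distanceToScore(distances[i]) >= minScore);
}

// Embed the user message, query the collection, and build a context block.
async function autoRecall(message: string, cfg: RecallConfig): Promise<string> {
  const embRes = await fetch(`${cfg.ollamaUrl}/api/embeddings`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: cfg.embeddingModel, prompt: message }),
  });
  const { embedding } = await embRes.json();

  const queryRes = await fetch(
    `${cfg.chromaUrl}/api/v1/collections/${cfg.collectionName}/query`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        query_embeddings: [embedding],
        n_results: cfg.autoRecallResults,
      }),
    }
  );
  const result = await queryRes.json();
  const memories = filterByScore(
    result.documents[0],
    result.distances[0],
    cfg.minScore
  );
  return memories.length
    ? `Relevant memories:\n${memories.map((m: string) => `- ${m}`).join("\n")}`
    : "";
}
```

The returned string would then be prepended to the agent's context before the turn; when nothing clears `minScore`, nothing is injected.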

Installation

To install the ChromaDB Memory skill, you first need to ensure you have the prerequisites in place:

  1. ChromaDB: Run ChromaDB, preferably using Docker:

    docker run -d --name chromadb -p 8100:8000 chromadb/chroma:latest
    
  2. Ollama: Pull an embedding model, such as nomic-embed-text:

    ollama pull nomic-embed-text
    
  3. Indexed Documents: Populate your ChromaDB collection with documents. You can use any ChromaDB-compatible indexing tool for this.
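
As a rough sketch of what such an indexer does (the helper names are hypothetical, and ChromaDB's v1 REST API may address collections by UUID rather than name in some versions; shown by name here for brevity):

```typescript
// Split a document into fixed-size chunks before embedding. This is naive;
// a production indexer would split on sentence or paragraph boundaries.
function chunkText(text: string, size = 512): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += size) {
    chunks.push(text.slice(i, i + size));
  }
  return chunks;
}

// Embed each chunk locally with Ollama and store it in the collection.
async function indexDocument(doc: string, id: string): Promise<void> {
  const chunks = chunkText(doc);
  for (let i = 0; i < chunks.length; i++) {
    const res = await fetch("http://localhost:11434/api/embeddings", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: "nomic-embed-text", prompt: chunks[i] }),
    });
    const { embedding } = await res.json();

    await fetch("http://localhost:8100/api/v1/collections/longterm_memory/add", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        ids: [`${id}-${i}`],
        embeddings: [embedding],
        documents: [chunks[i]],
      }),
    });
  }
}
```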

Once prerequisites are met, follow these steps:

  1. Copy Plugin Files: Copy the necessary script and configuration files to the OpenClaw extensions directory:

    mkdir -p ~/.openclaw/extensions/chromadb-memory
    cp {baseDir}/scripts/index.ts ~/.openclaw/extensions/chromadb-memory/
    cp {baseDir}/scripts/openclaw.plugin.json ~/.openclaw/extensions/chromadb-memory/
    
  2. Configure OpenClaw: Add the chromadb-memory plugin to your OpenClaw configuration file (~/.openclaw/openclaw.json). Adjust the config section to match your ChromaDB and Ollama setup: set chromaUrl, collectionName, ollamaUrl, and embeddingModel, and optionally fine-tune autoRecall, autoRecallResults, and minScore.

    {
      "plugins": {
        "entries": {
          "chromadb-memory": {
            "enabled": true,
            "config": {
              "chromaUrl": "http://localhost:8100",
              "collectionName": "longterm_memory",
              "ollamaUrl": "http://localhost:11434",
              "embeddingModel": "nomic-embed-text",
              "autoRecall": true,
              "autoRecallResults": 3,
              "minScore": 0.5
            }
          }
        }
      }
    }
    
  3. Restart Gateway: Ensure the changes take effect by restarting the OpenClaw gateway:

    openclaw gateway restart
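
After restarting, it can help to confirm both backing services are reachable. A small check like the one below (illustrative helper, not part of the skill; it assumes ChromaDB's `/api/v1/heartbeat` and Ollama's `/api/tags` endpoints) catches the most common misconfigurations:

```typescript
interface OllamaTags {
  models: { name: string }[];
}

// Ollama lists models as "name:tag" (e.g. "nomic-embed-text:latest"),
// so match either the bare name or a tagged variant.
function hasModel(tags: OllamaTags, model: string): boolean {
  return tags.models.some(
    (m) => m.name === model || m.name.startsWith(model + ":")
  );
}

async function checkServices(): Promise<void> {
  const hb = await fetch("http://localhost:8100/api/v1/heartbeat");
  if (!hb.ok) throw new Error("ChromaDB is not reachable on port 8100");

  const res = await fetch("http://localhost:11434/api/tags");
  const tags: OllamaTags = await res.json();
  if (!hasModel(tags, "nomic-embed-text")) {
    throw new Error("nomic-embed-text has not been pulled in Ollama");
  }
  console.log("ChromaDB and Ollama are both up");
}
```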
    

Use Cases

This skill is invaluable for agents that require persistent knowledge and context across multiple interactions. Potential use cases include:

  • Personal Assistants: Remembering user preferences, past appointments, or frequently requested information.
  • Customer Support Bots: Recalling previous support tickets, customer history, or product-specific knowledge bases.
  • Research Assistants: Maintaining context from long research sessions, remembering key findings, and connecting disparate pieces of information.
  • Content Creation Agents: Remembering stylistic preferences, previous drafts, or project-specific guidelines.
  • Personalized Tutors: Adapting explanations based on a student's past performance and learning history.

Example Prompts

  1. After a long discussion about project requirements: "Summarize the key technical constraints we identified for the new web app."
  2. Following a conversation where a specific API was discussed: "What was the endpoint we decided to use for user authentication?"
  3. When asking for advice on a topic previously explored: "Based on our last conversation about sustainable gardening, what are some beginner-friendly herbs?"

Tips & Limitations

  • Tuning minScore and autoRecallResults: If the auto-recall is injecting too much irrelevant information ("too noisy"), increase minScore. Conversely, if important context is being missed, lower minScore or increase autoRecallResults.
  • Manual Search: For precise retrieval, the chromadb_search tool allows you to explicitly query your memory.
  • Token Cost: While auto-recall adds some tokens to the context, the overhead is generally negligible for agents with large context windows (e.g., 200K+ tokens).
  • Data Indexing: The effectiveness of this skill heavily relies on the quality and relevance of the data indexed in your ChromaDB collection. Ensure your indexing process is robust.
  • No Cloud Dependencies: A significant advantage is the 100% local operation, enhancing privacy and reducing reliance on external services. This means no API keys or cloud costs for memory storage and retrieval.
  • Embedding Model Choice: While nomic-embed-text is recommended and configured by default, you can experiment with other Ollama-compatible embedding models if needed, provided they match the dimensionality expected by your stored data.
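
On that last point, a quick sanity check before switching models can save a corrupted collection. The table and helper names below are illustrative assumptions, not part of the skill:

```typescript
// Common Ollama embedding models and their output dimensions. Verify the
// value for your model; mixing dimensions in one collection breaks search.
const KNOWN_DIMENSIONS: Record<string, number> = {
  "nomic-embed-text": 768,
  "mxbai-embed-large": 1024,
  "all-minilm": 384,
};

// True only when both models are known and produce same-sized vectors.
function dimensionsMatch(current: string, candidate: string): boolean {
  const a = KNOWN_DIMENSIONS[current];
  const b = KNOWN_DIMENSIONS[candidate];
  return a !== undefined && a === b;
}

// Empirically probe a model's dimension by embedding a short test string.
async function probeDimension(model: string): Promise<number> {
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt: "dimension probe" }),
  });
  const { embedding } = await res.json();
  return embedding.length;
}
```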

Metadata

Stars: 1335 · Views: 43 · Updated: 2026-02-23
Add to Configuration

Paste this into your clawhub.json to enable this plugin.

{
  "plugins": {
    "official-msensintaffar-chromadb-memory": {
      "enabled": true,
      "auto_update": true
    }
  }
}

Tags

#memory #chromadb #ollama #vector-search #local #self-hosted #auto-recall