soul-in-sapphire
Generic long-term memory (LTM) operations for OpenClaw using Notion (2025-09-03 data_sources). Use for durable memory writes/search, emotion-state ticks, journal writes, and model-controlled subagent spawn planning via local JSON presets.
Why use this skill?
Enhance your OpenClaw agent with Soul-in-Sapphire. Enable durable long-term memory, emotional state tracking, and reflective journaling via Notion.
Install via CLI (Recommended)
clawhub install openclaw/skills/skills/nextaltair/soul-in-sapphire
What This Skill Does
soul-in-sapphire serves as the cognitive bedrock for OpenClaw, bridging the gap between fleeting AI interactions and durable, long-term self-awareness. It leverages Notion as a persistent knowledge graph, utilizing a specialized five-part database schema to track memory, events, emotional states, and environmental context. Unlike basic logging, this skill is designed for 'cognitive continuity,' where the agent treats its past experiences—both factual and emotional—as data points to refine future decision-making, subagent planning, and behavioral adaptation.
Installation
To integrate this skill into your environment, use the command: clawhub install openclaw/skills/skills/nextaltair/soul-in-sapphire. Ensure your environment variables are configured with a valid NOTION_API_KEY (using API version 2025-09-03). The skill automatically manages a local configuration file at ~/.config/soul-in-sapphire/config.json. If you have custom scripts, you may define an explicit NOTIONCTL_PATH environment variable to override the default dependency path to scripts/notionctl.mjs.
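As a hedged sketch, the pre-install setup described above might look like the following; the token value is a placeholder, and the paths are taken directly from this page:

```shell
# Sketch of pre-install environment setup (token value is a placeholder).
export NOTION_API_KEY="secret_placeholder"            # Notion API version 2025-09-03
# Optional: point the skill at a custom notionctl script instead of the
# default scripts/notionctl.mjs dependency.
export NOTIONCTL_PATH="$PWD/scripts/notionctl.mjs"
clawhub install openclaw/skills/skills/nextaltair/soul-in-sapphire
```

After installation, the skill manages its own state in ~/.config/soul-in-sapphire/config.json; you normally do not need to edit that file by hand.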
Use Cases
- Long-Term Memory Persistence: Storing verified project decisions, technical procedures, and user preferences that persist across sessions.
- Affective Computing: Automatically tagging user interactions with emotional axes (like arousal, valence, and stress) to track the agent’s internal sentiment regarding task progress.
- Self-Reflective Journaling: Creating daily or event-driven summaries that capture the 'why' behind an action, not just the 'what.'
- Subagent Orchestration: Utilizing local JSON presets to plan complex, multi-stage agent deployments based on historical successful outcomes.
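For the subagent orchestration case, a local preset is plain JSON. The preset schema is not documented on this page, so the field names and values below are assumptions borrowed from the related subagent-spawn-command-builder skill (model/thinking/timeout/cleanup/agentId/label), not a confirmed format:

```json
{
  "label": "deploy-reviewer",
  "model": "example-model-name",
  "thinking": "high",
  "timeout": 600,
  "cleanup": true,
  "agentId": "deploy-reviewer-01"
}
```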
Example Prompts
- "Record a new decision: We are choosing PostgreSQL over MongoDB for the backend because of the relational nature of our user data. Confidence level high."
- "Reflect on our last session: I feel like we hit a block with the API integration. Log this as a frustration event and identify if we need a break or a different documentation source."
- "Recall our previous workflow for site deployment; show me the tags associated with our high-confidence successes from last month."
Tips & Limitations
- Quality over Quantity: Focus on high-signal events. Flooding the database with trivial logs makes retrieval less efficient. The skill is built for growth, not archival density.
- State Interpretation: Ensure your state updates provide context; the AI can only improve if the 'Why' behind an emotion is clearly documented.
- Dependency: This skill relies heavily on notion-api-automation. If you encounter sync issues, verify the connection status of the notionctl utility before troubleshooting your database permissions.
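As a troubleshooting sketch (not part of the skill itself), one quick sanity check before digging into database permissions is confirming that the notionctl dependency resolves; the default path and the NOTIONCTL_PATH override come from the installation notes above:

```shell
# Resolve the notionctl path the way the skill is described to:
# NOTIONCTL_PATH wins, otherwise fall back to the bundled script.
CTL="${NOTIONCTL_PATH:-scripts/notionctl.mjs}"
if [ -f "$CTL" ]; then
  echo "notionctl present: $CTL"
else
  echo "notionctl missing: $CTL" >&2
fi
```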
Metadata
Paste this into your clawhub.json to enable this plugin.
{
  "plugins": {
    "official-nextaltair-soul-in-sapphire": {
      "enabled": true,
      "auto_update": true
    }
  }
}
Tags: AI
Flags: file-write, file-read, external-api
Related Skills
diy-pc-ingest
Ingest pasted PC parts purchase/config text (Discord message receipts, bullet lists) into Notion DIY_PC tables (PCConfig, ストレージ, エンクロージャー, PCInput). Use when the user pastes raw purchase logs/spec notes and wants the AI to classify, enrich via web search, ask follow-up questions for unknowns, and then upsert rows into the correct Notion data sources using the 2025-09-03 data_sources API.
calibre-catalog-read
Read Calibre catalog data via calibredb over a Content server, and run one-book analysis workflow that writes HTML analysis block back to comments while caching analysis state in SQLite. Use for list/search/id lookups and AI reading pipeline for a selected book.
subagent-spawn-command-builder
Build sessions_spawn command payloads from JSON profiles. Use when you want reusable subagent profiles (model/thinking/timeout/cleanup/agentId/label) and command-ready JSON without executing spawn.
calibre-metadata-apply
Apply metadata updates to existing Calibre books via calibredb over a Content server. Use for controlled metadata edits after target IDs are confirmed by a read-only lookup.