calibre-catalog-read
Read Calibre catalog data via calibredb over a Content server, and run a one-book analysis workflow that writes an HTML analysis block back to the book's comments while caching analysis state in SQLite. Use for list/search/id lookups and for the AI reading pipeline on a selected book.
Install via CLI (Recommended)
clawhub install openclaw/skills/skills/nextaltair/calibre-catalog-read
Use this skill for:
- Read-only catalog lookup (`list`/`search`/`id`)
- One-book AI reading workflow (export -> analyze -> cache -> comments HTML apply)
Requirements
- `calibredb` available on PATH in the runtime where scripts are executed.
- `ebook-convert` available for text extraction.
- `subagent-spawn-command-builder` installed (for spawn payload generation).
- A reachable Calibre Content server URL in `--with-library` format: `http://HOST:PORT/#LIBRARY_ID`
  - Do not assume localhost/127.0.0.1; always pass an explicit reachable `HOST:PORT`.
- If auth is enabled:
  - Preferred: set `CALIBRE_USERNAME=<user>` and `CALIBRE_PASSWORD=<password>` in `/home/altair/.openclaw/.env`.
  - Then pass only `--password-env CALIBRE_PASSWORD` (the username auto-loads from env).
  - You can still override with `--username <user>` explicitly.
- Optional auth cache file: `~/.config/calibre-catalog-read/auth.json`
  - Avoid `--save-plain-password` unless explicitly requested.
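The tool requirements above can be verified before running anything. This is a small preflight sketch, not part of the skill's shipped scripts; the `check_tools` helper name is illustrative:

```shell
# Preflight sketch (not part of the skill): verify required tools are on PATH.
check_tools() {
  missing=0
  for tool in "$@"; do
    if ! command -v "$tool" >/dev/null 2>&1; then
      echo "missing: $tool" >&2
      missing=1
    fi
  done
  return "$missing"
}

# Usage for this skill:
# check_tools calibredb ebook-convert || exit 1
```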
Commands
List books (JSON):
node skills/calibre-catalog-read/scripts/calibredb_read.mjs list \
--with-library "http://192.168.11.20:8080/#Calibreライブラリ" \
--password-env CALIBRE_PASSWORD \
--limit 50
Search books (JSON):
node skills/calibre-catalog-read/scripts/calibredb_read.mjs search \
--with-library "http://192.168.11.20:8080/#Calibreライブラリ" \
--password-env CALIBRE_PASSWORD \
--query 'series:"中公文庫"'
Get one book by id (JSON):
node skills/calibre-catalog-read/scripts/calibredb_read.mjs id \
--with-library "http://192.168.11.20:8080/#Calibreライブラリ" \
--password-env CALIBRE_PASSWORD \
--book-id 3
Run one-book pipeline (analyze + comments HTML apply + cache):
uv run python skills/calibre-catalog-read/scripts/run_analysis_pipeline.py \
--with-library "http://192.168.11.20:8080/#Calibreライブラリ" \
--password-env CALIBRE_PASSWORD \
--book-id 3 --lang ja
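Every read invocation repeats the same `--with-library` and auth flags, so they can be factored into a small helper. The `read_cmd` function below is illustrative only (it prints the composed command rather than executing it); the flags are exactly those documented above:

```shell
# Illustrative helper: compose a calibredb_read.mjs command line from the
# recurring flags. Prints the command instead of executing it.
READ=skills/calibre-catalog-read/scripts/calibredb_read.mjs
LIB='http://192.168.11.20:8080/#Calibreライブラリ'

read_cmd() {
  sub=$1; shift
  printf 'node %s %s --with-library "%s" --password-env CALIBRE_PASSWORD %s\n' \
    "$READ" "$sub" "$LIB" "$*"
}

read_cmd list --limit 50
```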
Cache DB
Initialize DB schema:
uv run python skills/calibre-catalog-read/scripts/analysis_db.py init \
--db skills/calibre-catalog-read/state/calibre_analysis.sqlite
Check current hash state:
uv run python skills/calibre-catalog-read/scripts/analysis_db.py status \
--db skills/calibre-catalog-read/state/calibre_analysis.sqlite \
--book-id 3 --format EPUB
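Schema init can be guarded so it only runs when the cache file does not yet exist. `init_if_missing` below is a hypothetical wrapper, not shipped with the skill:

```shell
# Hypothetical guard: run the given init command only when the DB file is absent.
init_if_missing() {
  db=$1; shift
  if [ -f "$db" ]; then
    echo "exists: $db"
  else
    "$@" --db "$db"
  fi
}

# Real usage (assumes uv and the skill scripts are installed):
# init_if_missing skills/calibre-catalog-read/state/calibre_analysis.sqlite \
#   uv run python skills/calibre-catalog-read/scripts/analysis_db.py init
```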
Main vs Subagent responsibility (strict split)
Use this split to avoid long blocking turns on chat listeners.
Main agent (fast control plane)
- Validate user intent and target `book_id`.
- Confirm subagent runtime knobs: `model`, `thinking`, `runTimeoutSeconds`.
- Start the subagent and return a short progress reply quickly.
- After the subagent result arrives, run the DB upsert + Calibre apply.
- Report the final result to the user.
Subagent (heavy analysis plane)
- Read extracted source payload.
- Generate analysis JSON strictly by schema.
- Do not run metadata apply or user-facing channel actions.
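The control-plane sequence above can be condensed into pseudocode; the function name is hypothetical and the numbered steps live in the skill's real scripts:

```shell
# Illustrative pseudocode for the main agent's control-plane sequence;
# run_one_book is a hypothetical name, not part of the skill.
run_one_book() {
  book_id=$1
  # 1. Validate intent and book_id with a read-only lookup.
  # 2. Confirm runtime knobs: model, thinking, runTimeoutSeconds.
  # 3. Spawn the subagent; reply immediately with a short progress message.
  # 4. On subagent result: upsert the cache DB, then apply comments HTML.
  # 5. Report the final result to the user.
  echo "dispatched book $book_id"
}
```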
Metadata
Paste this into your clawhub.json to enable this plugin:
{
"plugins": {
"official-nextaltair-calibre-catalog-read": {
"enabled": true,
"auto_update": true
}
}
}
Related Skills
diy-pc-ingest
Ingest pasted PC parts purchase/config text (Discord message receipts, bullet lists) into Notion DIY_PC tables (PCConfig, ストレージ, エンクロージャー, PCInput). Use when the user pastes raw purchase logs/spec notes and wants the AI to classify, enrich via web search, ask follow-up questions for unknowns, and then upsert rows into the correct Notion data sources using the 2025-09-03 data_sources API.
soul-in-sapphire
Generic long-term memory (LTM) operations for OpenClaw using Notion (2025-09-03 data_sources). Use for durable memory writes/search, emotion-state ticks, journal writes, and model-controlled subagent spawn planning via local JSON presets.
subagent-spawn-command-builder
Build sessions_spawn command payloads from JSON profiles. Use when you want reusable subagent profiles (model/thinking/timeout/cleanup/agentId/label) and command-ready JSON without executing spawn.
calibre-metadata-apply
Apply metadata updates to existing Calibre books via calibredb over a Content server. Use for controlled metadata edits after target IDs are confirmed by a read-only lookup.