calibre-metadata-apply
Apply metadata updates to existing Calibre books via calibredb over a Content server. Use for controlled metadata edits after target IDs are confirmed by a read-only lookup.
Install via CLI (Recommended)
clawhub install openclaw/skills/skills/nextaltair/calibre-metadata-apply
A skill for updating metadata of existing Calibre books.
Requirements
- `calibredb` must be available on PATH in the runtime environment
- `subagent-spawn-command-builder` installed (for spawn payload generation)
- `pdffonts` is optional but recommended for PDF evidence checks
- Reachable Calibre Content server URL, e.g. `http://HOST:PORT/#LIBRARY_ID`
- If authentication is enabled, prefer `/home/altair/.openclaw/.env`:
  - `CALIBRE_USERNAME=<user>`
  - `CALIBRE_PASSWORD=<password>`
- Pass `--password-env CALIBRE_PASSWORD` (the username auto-loads from env)
- You can still override explicitly with `--username <user>`
- Optional auth cache: `--save-auth` (default file: `~/.config/calibre-metadata-apply/auth.json`)
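The credential rule above can be sketched in a few lines of Python. The values below are placeholders, and the `argv` list is only illustrative: the point is that the secret is read from the environment and only the variable *name* travels on the command line via `--password-env`, so the password never appears in argv or shell history.

```python
import os

# Placeholder credentials; in real use they come from /home/altair/.openclaw/.env
os.environ["CALIBRE_USERNAME"] = "reader"
os.environ["CALIBRE_PASSWORD"] = "s3cret"

password_env = "CALIBRE_PASSWORD"            # the *name*, not the value
username = os.environ["CALIBRE_USERNAME"]
password = os.environ[password_env]
argv = ["--password-env", password_env]      # no secret in the argument list
```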
Supported fields
Direct fields (set_metadata --field)
- `title`, `title_sort`
- `authors` (string joined with `&`, or an array), `author_sort`
- `series`, `series_index`
- `tags` (string or array)
- `publisher`, `pubdate` (`YYYY-MM-DD`), `languages`, `comments`
Helper fields
- `comments_html` (OC marker block upsert)
- `analysis` (auto-generates analysis HTML for `comments`)
- `analysis_tags` (adds tags)
- `tags_merge` (default `true`)
- `tags_remove` (remove specific tags after merge)
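The direct and helper fields above can be combined in one JSONL update record. The sketch below assumes one JSON object per line keyed by an `id` field; the record shape and the `id` key are assumptions for illustration, while the field names come from the lists above.

```python
import json

# Hypothetical JSONL update record (one object per line, keyed by a
# confirmed book id). Field names match the "Supported fields" lists;
# the "id" key and overall layout are assumptions, not a documented schema.
record = {
    "id": 123,                            # confirmed via read-only lookup
    "title": "Example Book",
    "authors": ["Jane Doe", "John Roe"],  # array form; "Jane Doe & John Roe" also allowed
    "pubdate": "2024-05-01",              # YYYY-MM-DD
    "tags": ["fiction"],
    "tags_merge": True,                   # merge with existing tags (the default)
    "tags_remove": ["to-sort"],           # stripped after the merge
}
line = json.dumps(record, ensure_ascii=False)
```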
Required execution flow
A. Target confirmation (mandatory)
- Run a read-only lookup to narrow candidates
- Show `id`, `title`, `authors`, `series`, `series_index`
- Get user confirmation for final target IDs
- Build JSONL using only confirmed IDs
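Step A's gate can be sketched as: JSONL lines are built only for IDs the user explicitly confirmed, never for every lookup candidate. The ids and titles below are made up for illustration.

```python
import json

# Lookup results (id -> title) and the subset the user confirmed.
candidates = {101: "Dune", 102: "Dune Messiah", 103: "Children of Dune"}
confirmed_ids = [101, 103]    # outcome of the user-confirmation step

lines = [
    json.dumps({"id": book_id, "tags": ["frank-herbert"]})
    for book_id in confirmed_ids
    if book_id in candidates  # drop anything not in the lookup results
]
```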
B. Proposal synthesis (when metadata is missing)
- Collect evidence from file extraction + web sources
- Show one merged proposal table with: `candidate`, `source`, `confidence` (high|medium|low), `title_sort_candidate`, `author_sort_candidate`
- Get user decision: `approve all`, `approve only: <fields>`, `reject: <fields>`, or `edit: <field>=<value>`
- Apply only approved/finalized fields
- If confidence is low or sources conflict, keep fields empty
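The last rule of step B can be sketched as a small resolver: a field gets a final value only when its sources agree on a single candidate and at least one source is better than "low" confidence; otherwise it stays empty. The proposal data below is illustrative, not from a real lookup.

```python
# Each field maps to (source, candidate_value, confidence) tuples.
proposals = {
    "title":     [("file-extract", "Dune", "high"), ("web", "Dune", "medium")],
    "publisher": [("web", "Ace Books", "low")],                 # low-only
    "series":    [("file-extract", "Dune Saga", "high"),
                  ("web", "Dune Chronicles", "high")],          # conflict
}

def resolve(candidates):
    values = {value for _, value, _ in candidates}
    if len(values) != 1:                                 # sources conflict
        return None
    if all(conf == "low" for _, _, conf in candidates):  # confidence too low
        return None
    return values.pop()

final = {field: resolve(cands) for field, cands in proposals.items()}
```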
C. Apply
- Run a dry-run first (mandatory)
- Run `--apply` only after explicit user approval
- Re-read and report final values
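Step C's ordering can be sketched with a stand-in backend: the dry-run is unconditional and always first, `--apply` runs only after explicit approval, and the final values are re-read afterwards. `RecordingBackend` is a test double for illustration, not part of the skill.

```python
class RecordingBackend:
    """Stand-in for the calibredb wrapper; records the call order."""
    def __init__(self):
        self.calls = []
    def dry_run(self, updates):
        self.calls.append("dry_run")
    def apply(self, updates):
        self.calls.append("apply")
    def reread(self, ids):
        self.calls.append("reread")
        return {book_id: {} for book_id in ids}

def run_flow(updates, approved, backend):
    backend.dry_run(updates)   # mandatory, unconditional first step
    if not approved:           # never reach --apply without approval
        return None
    backend.apply(updates)
    return backend.reread([u["id"] for u in updates])
```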
Analysis worker policy
- Use `subagent-spawn-command-builder` to generate the `sessions_spawn` payload for heavy candidate generation; `task` is required
- The profile should include model/thinking/timeout/cleanup settings for this workflow
- Use a lightweight subagent model for analysis (avoid the main heavy model)
- Keep final decisions and dry-run/apply in the main session
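A `sessions_spawn` payload along the lines this policy describes might look like the sketch below. The exact key names are assumptions; the policy only requires `task` and recommends model/thinking/timeout/cleanup settings in the profile.

```python
import json

# Hypothetical spawn payload; key names other than "task" are assumed.
payload = {
    "task": "Generate metadata candidates for book id 123",  # required
    "model": "light-model",   # lightweight subagent, not the main heavy model
    "thinking": "low",
    "timeout": 300,
    "cleanup": True,
}
serialized = json.dumps(payload)
```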
Data flow disclosure
Metadata
Paste this into your clawhub.json to enable this plugin.
{
"plugins": {
"official-nextaltair-calibre-metadata-apply": {
"enabled": true,
"auto_update": true
}
}
}
Related Skills
diy-pc-ingest
Ingest pasted PC parts purchase/config text (Discord message receipts, bullet lists) into Notion DIY_PC tables (PCConfig, ストレージ, エンクロージャー, PCInput). Use when the user pastes raw purchase logs/spec notes and wants the AI to classify, enrich via web search, ask follow-up questions for unknowns, and then upsert rows into the correct Notion data sources using the 2025-09-03 data_sources API.
calibre-catalog-read
Read Calibre catalog data via calibredb over a Content server, and run one-book analysis workflow that writes HTML analysis block back to comments while caching analysis state in SQLite. Use for list/search/id lookups and AI reading pipeline for a selected book.
soul-in-sapphire
Generic long-term memory (LTM) operations for OpenClaw using Notion (2025-09-03 data_sources). Use for durable memory writes/search, emotion-state ticks, journal writes, and model-controlled subagent spawn planning via local JSON presets.
subagent-spawn-command-builder
Build sessions_spawn command payloads from JSON profiles. Use when you want reusable subagent profiles (model/thinking/timeout/cleanup/agentId/label) and command-ready JSON without executing spawn.