Doubleword API
Skill by pjb157
Why use this skill?
Learn to manage large-scale asynchronous AI inference jobs using the Doubleword API skill for OpenClaw. Streamline your batch workloads today.
Install via CLI (Recommended)
clawhub install openclaw/skills/skills/pjb157/doubleword-api
What This Skill Does
The Doubleword API skill enables OpenClaw users to interface with the Doubleword batch processing engine (api.doubleword.ai). This skill allows for the asynchronous execution of large-scale AI inference tasks. By submitting JSONL files, users can offload heavy processing workloads that do not require immediate, real-time responses. The skill manages the lifecycle of these batch jobs, from initial file creation and uploading to tracking job statuses and retrieving finalized results once the processing is complete. It is fully compatible with standard OpenAI-style batch endpoints, making it highly versatile for developers and data scientists.
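Because the API is described as OpenAI-style, the batch lifecycle can be sketched with plain HTTP calls. This is a minimal sketch, not the skill's actual implementation: the base URL path and endpoint routes below are assumptions based on the OpenAI batch convention, and the functions are only defined here, not invoked.

```python
import json
import os
import urllib.request

# Assumed base URL; the skill targets api.doubleword.ai with
# OpenAI-style batch endpoints (the /v1 paths are assumptions).
BASE_URL = "https://api.doubleword.ai/v1"

def _request(path, data=None, method="GET"):
    """Issue an authenticated JSON request to the batch API."""
    req = urllib.request.Request(
        BASE_URL + path,
        data=json.dumps(data).encode() if data is not None else None,
        method=method,
        headers={
            "Authorization": f"Bearer {os.environ['DOUBLEWORD_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def create_batch(input_file_id, endpoint="/v1/chat/completions"):
    """Start a batch job from an already-uploaded JSONL file."""
    return _request("/batches", {
        "input_file_id": input_file_id,
        "endpoint": endpoint,
        "completion_window": "24h",
    }, method="POST")

def get_batch_status(batch_id):
    """Fetch the current status of a batch job by its ID."""
    return _request(f"/batches/{batch_id}")
```

A caller would poll `get_batch_status` until the job reports a terminal state, then download the output file referenced in the response.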
Installation
To install this skill in your OpenClaw environment, use the following terminal command:
clawhub install openclaw/skills/skills/pjb157/doubleword-api
This command pulls the latest stable version from the official openclaw/skills repository managed by pjb157. Ensure you have your DOUBLEWORD_API_KEY configured in your environment variables before attempting to run batch operations.
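The key can be exported in your shell before running batch operations; the placeholder value below is illustrative.

```shell
# Set the Doubleword API key for the current session
# (replace the placeholder with your actual key).
export DOUBLEWORD_API_KEY="dw-your-key-here"

# To persist it across sessions, append the same line
# to your shell profile, e.g. ~/.bashrc or ~/.zshrc.
```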
Use Cases
- Mass Data Processing: Efficiently generate summaries, sentiment analysis, or structured data extraction for thousands of documents at once.
- Cost Optimization: Utilize the 24-hour batch window to take advantage of significantly reduced pricing compared to standard real-time API calls.
- Rate Limit Mitigation: Avoid hitting real-time rate limits by distributing high-volume requests into asynchronous batch jobs.
- Background Workflows: Automate large inference tasks that can run in the background while the agent focuses on other high-priority interactive tasks.
Example Prompts
- "Use the Doubleword API to process the file 'user_data.jsonl' and monitor the batch status until the results are ready."
- "Create a batch job for model 'anthropic/claude-3-5-sonnet' using my saved file ID file_abc123 and let me know when it finishes."
- "Upload my prepared requests.jsonl file to Doubleword and initiate a new batch inference job for me."
Tips & Limitations
- File Formatting: Ensure your .jsonl files are strictly formatted as newline-delimited JSON. Each object must contain the required custom_id, method, url, and body keys.
- Scaling: For extremely large workloads exceeding 200MB, split your data into multiple smaller batch files to avoid upload errors.
- Mapping: Always use descriptive custom_id fields. This makes it significantly easier to reconcile outputs with your input source material once the batch job is finished.
- Persistence: Doubleword batch files have a limited storage duration; download your results promptly after the job reaches a 'completed' status to prevent data loss.
Metadata
Paste this into your clawhub.json to enable this plugin.
{
"plugins": {
"official-pjb157-doubleword-api": {
"enabled": true,
"auto_update": true
}
}
}
Tags: AI
Flags: external-api, file-read, file-write