dual-stream-architecture
Dual-stream event publishing combining Kafka for durability with Redis Pub/Sub for real-time delivery. Use when building event-driven systems that need both guaranteed delivery and low-latency updates. Triggers on dual stream, event publishing, Kafka Redis, real-time events, pub/sub, streaming architecture.
Why use this skill?
Optimize your event-driven systems by combining Kafka's durability with Redis's real-time performance. Install the dual-stream-architecture skill today.
Install via CLI (Recommended)
clawhub install openclaw/skills/skills/wpank/dual-stream-architecture
What This Skill Does
The dual-stream-architecture skill provides a robust pattern for event-driven systems that require the best of both worlds: persistence and performance. It works by implementing a dual-publish strategy where events are simultaneously sent to Kafka and Redis. The Kafka stream ensures data durability and fault tolerance, acting as the primary source of truth, while the Redis Pub/Sub stream provides a low-latency pathway for real-time applications like live dashboards, notification systems, or WebSocket-based user interfaces. This architectural pattern prevents the common trade-off between reliable background processing and immediate user feedback.
Installation
To integrate this architecture into your workspace, run the following command in your terminal:
npx clawhub@latest install dual-stream-architecture
Alternatively, for specific repository management:
clawhub install openclaw/skills/skills/wpank/dual-stream-architecture
Use Cases
- Real-time Dashboards: Displaying live metrics or event logs as they occur without waiting for slow database writes.
- WebSocket Backends: Pushing live updates to browser clients via Redis, while keeping the main logic processed by Kafka consumers.
- Complex Event Processing: When you need a fast path for notification delivery while concurrently storing full event payloads for auditing and historical analysis.
- Scalable Notification Systems: Routing high-volume updates to distributed services while maintaining a persistent record of every interaction.
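For the dashboard and WebSocket cases above, the Redis side typically just fans incoming messages out to connected consumers. A schematic sketch, with all names (`Dashboard`, `fan_out`) being hypothetical stand-ins rather than part of the skill:

```python
import json

class Dashboard:
    """Collects live events pushed over the real-time (Redis) path."""
    def __init__(self):
        self.feed = []

    def on_message(self, channel, message):
        # Only lightweight metadata travels over Redis; the full payload
        # lives in Kafka and can be fetched on demand if needed.
        self.feed.append({"channel": channel, "event": json.loads(message)})

def fan_out(subscribers, channel, message):
    """Deliver one pub/sub message to every subscriber; returns the count."""
    for sub in subscribers:
        sub.on_message(channel, message)
    return len(subscribers)
```

A real WebSocket backend would replace `fan_out` with the delivery loop of its pub/sub library, pushing each message to every open socket subscribed to the channel.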
Example Prompts
- "OpenClaw, implement the DualPublisher struct in my current backend and configure it to target our existing Redis cluster."
- "Show me how to refactor my current Kafka-only publisher to use the dual-stream pattern for my real-time analytics module."
- "Help me define a custom channel naming convention using the events:{source_type}:{source_id} format to replace my hardcoded string identifiers."
Tips & Limitations
- Best-Effort Redis: The current implementation treats Redis as 'best-effort'. This is intentional to ensure Kafka writes are never blocked by Redis availability, but be aware that Redis messages can be dropped during network hiccups.
- Payload Sizing: Keep your Redis payloads lightweight. Since Kafka is your durable store, store the full event payload there and send only necessary metadata or IDs through Redis to keep the real-time stream fast and efficient.
- Channel Hygiene: Use consistent naming conventions. The provided pattern (events:{source_type}:{source_id}) is optimized for pattern matching in Redis (e.g., PSUBSCRIBE events:user:*).
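The last two tips can be captured in two small helpers. This is a sketch: the metadata field names kept for the Redis payload are illustrative assumptions, not mandated by the skill.

```python
def channel_for(source_type, source_id):
    """Build a channel name in the events:{source_type}:{source_id} form,
    so consumers can PSUBSCRIBE to patterns like events:user:*."""
    return f"events:{source_type}:{source_id}"

def slim_payload(event, keep=("id", "type", "ts")):
    """Strip an event down to the metadata worth sending over Redis;
    the full payload stays in Kafka, the durable store."""
    return {k: event[k] for k in keep if k in event}
```

Keeping the Redis message to a handful of identifying fields means subscribers that need the full payload can look it up from the durable side, while the real-time stream stays small and fast.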
Metadata
Paste this into your clawhub.json to enable this plugin.
{
"plugins": {
"official-wpank-dual-stream-architecture": {
"enabled": true,
"auto_update": true
}
}
}
Flags: network-access, code-execution
Related Skills
mermaid-diagrams
Create software diagrams using Mermaid syntax. Use when users need to create, visualize, or document software through diagrams including class diagrams, sequence diagrams, flowcharts, ERDs, C4 architecture diagrams, state diagrams, git graphs, and other diagram types. Triggers include requests to diagram, visualize, model, map out, or show the flow of a system.
api-design-principles
Skill by wpank
auto-context
Automatically read relevant context before major actions. Loads TODO.md, roadmap.md, handoffs, task plans, and other project context files so the AI operates with full situational awareness. Use when starting a task, implementing a feature, refactoring, debugging, planning, or resuming a session.
clear-writing
Write clear, concise prose for humans — documentation, READMEs, API docs, commit messages, error messages, UI text, reports, and explanations. Combines Strunk's rules for clearer prose with technical documentation patterns, structure templates, and review checklists.
track-performance
Track the performance of Uniswap LP positions over time — check which positions need attention, are out of range, or have uncollected fees. Use when the user asks how their positions are doing.