NovelAI OpenClaw Adaptor · by askkumptenchen · published 2026-04-01

$ claw add gh:askkumptenchen/askkumptenchen-novelai-openclaw-adaptor

---
name: novelai-openclaw-adaptor
description: Explain how to connect NovelAI to OpenClaw through a local OpenAI-compatible shim. Use when the user wants configuration guidance for a local NovelAI adaptor, model selection, or OpenClaw `base_url` setup.
metadata: {"openclaw":{"os":["win32","linux","darwin"]}}
---
# NovelAI OpenClaw Adaptor
Use this skill to explain how a local adaptor can bridge OpenClaw's OpenAI-style API calls to NovelAI, and how to configure OpenClaw to talk to that local endpoint.
## Scope

This skill is for explanation and local configuration guidance: bridging OpenClaw's OpenAI-style calls to NovelAI through a local adaptor, advising on model selection, and setting up OpenClaw's `base_url`.

Do not use this skill to install unverified packages, to run installs without user approval, or to handle NovelAI credentials in chat.
## Safety rules
Follow these rules whenever this skill is used:
1. Treat installation as optional and approval-based.
2. Before suggesting installation, tell the user to verify the package source, maintainer, and repository or project page.
3. Prefer local source checkout or an already-verified package source when available.
4. Never ask the user to paste a NovelAI API key into chat.
5. Never include secrets inline in command examples.
6. If the package source cannot be verified, stop at configuration guidance and ask the user how they want to proceed.
## Verification-first workflow
If the user wants to enable runtime use, follow this order:
1. Check whether the adaptor is already present locally or already installed.
2. If it is not present, explain that the package source should be verified before any installation.
3. Ask for approval before running any install command.
4. After the user has verified the source and approved installation, use the package's documented install method.
5. Prefer interactive or local-only credential entry during configuration.
Safe examples:
```shell
pip install novelai-openclaw-adaptor
novelai-config init
```

Do not use examples like:

```shell
novelai-config init --api-key "YOUR_NOVELAI_API_KEY"
```

If the user needs help deciding whether the package is trustworthy, suggest reviewing its repository, release history, and maintainers before installation.
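Step 1 of the workflow above (checking whether the adaptor is already present) can be sketched in Python. The three command names come from the "Helpful local commands" section; `check_adaptor_tools` is a hypothetical helper for illustration, not part of the package:

```python
import shutil

def check_adaptor_tools(tools=("novelai-config", "novelai-shim", "novelai-image")):
    """Map each expected adaptor CLI name to its location on PATH, or None if absent."""
    return {tool: shutil.which(tool) for tool in tools}

# Report what is already installed before suggesting any install command.
for tool, path in check_adaptor_tools().items():
    print(f"{tool}: {path or 'not installed'}")
```

If every tool is missing, that is the point to pause and walk the user through source verification rather than installing anything automatically.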
## Supported models
Text models:
Image models:
When helping with setup, ask the user which model they want instead of assuming silently. If they do not care, recommend a sensible default from the lists above.
## OpenClaw configuration
When explaining how to connect OpenClaw to the adaptor:
1. Set `base_url` to the local adaptor endpoint, such as `http://127.0.0.1:xxxx/v1` or `http://localhost:xxxx/v1`.
2. Set the OpenClaw model name to the adaptor-exposed model the user selected.
3. Clarify that credential handling belongs to the local adaptor configuration, not the chat.
4. If OpenClaw insists on an API key field, explain that some clients accept a placeholder value such as `sk-local`, but the real NovelAI credential should stay in the local adaptor config only.
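As a hedged illustration of point 1: before trusting a configured `base_url`, it is easy to confirm that it points at the local machine, so the NovelAI credential never needs to leave it. The helper below is illustrative, not part of OpenClaw or the adaptor:

```python
from urllib.parse import urlparse

def is_local_base_url(base_url):
    """True if the endpoint host is the local machine (loopback addresses only)."""
    host = urlparse(base_url).hostname
    return host in ("127.0.0.1", "localhost", "::1")

# A loopback adaptor endpoint passes; a remote API endpoint does not.
assert is_local_base_url("http://127.0.0.1:8080/v1")
assert not is_local_base_url("https://api.example.com/v1")
```

The port `8080` above is only a placeholder for whatever port the local shim actually listens on.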
## Helpful local commands
If the user has already verified the package source and approved local usage, these commands are relevant:
```shell
novelai-config --help
novelai-shim --help
novelai-image --help
```

Use `novelai-config init` as the normal guided setup entry point. It should collect local configuration such as the NovelAI credential (entered interactively, never pasted into chat), the selected model, and the local port the shim listens on.
## Image generation usage
Once the local adaptor is configured and running, image generation can use the normal OpenClaw or OpenAI-style prompt flow.
Example prompt:
`1girl, solo, masterpiece, best quality, highly detailed`
The adaptor is responsible for translating that prompt into the format expected by NovelAI.
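As a hedged sketch (the port `8080`, the model name `nai-image`, and the `/images/generations` path follow OpenAI-style conventions and are assumptions, not documented adaptor behavior), the request an OpenAI-style client would send to the local adaptor could be built like this:

```python
import json
import urllib.request

def build_image_request(prompt, base_url="http://127.0.0.1:8080/v1"):
    """Build (but do not send) an OpenAI-style image-generation request.
    "sk-local" is a placeholder key; the real NovelAI credential stays in
    the local adaptor config, never in the client or the chat."""
    payload = {"model": "nai-image", "prompt": prompt, "n": 1, "size": "1024x1024"}
    return urllib.request.Request(
        f"{base_url}/images/generations",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json", "Authorization": "Bearer sk-local"},
        method="POST",
    )

req = build_image_request("1girl, solo, masterpiece, best quality, highly detailed")
# urllib.request.urlopen(req) would send it once the local adaptor is running.
```

The adaptor then handles the NovelAI-specific translation on its side, as described above.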