ModelReady
by carol-gutianle · published 2026-03-22
$ claw add gh:carol-gutianle/carol-gutianle-modelready

---
name: modelready
description: Start using a local or Hugging Face model instantly, directly from chat.
metadata: {"openclaw":{"requires":{"bins":["bash", "curl"]}, "env": ["URL"]}}
---
# ModelReady
ModelReady lets you **start using a local or Hugging Face model immediately**, without leaving clawdbot.
It turns a model into a running, OpenAI-compatible endpoint and allows you to chat with it directly from a conversation.
## When to use

Use this skill when you want to:

* Serve a local or Hugging Face model as a running, OpenAI-compatible endpoint.
* Chat with a running model directly from a conversation.
* Check on, or stop, a model server you started earlier.
## Commands

### Start a model server

```
/modelready start repo=<path-or-hf-repo> port=<port> [tp=<n>] [dtype=<dtype>]
```

Examples:

```
/modelready start repo=Qwen/Qwen2.5-7B-Instruct port=19001
/modelready start repo=/home/user/models/Qwen-2.5 port=8010 tp=4 dtype=bfloat16
```

### Chat with a running model

```
/modelready chat port=<port> text="<message>"
```

Example:

```
/modelready chat port=8010 text="hello"
```

### Check status or stop the server

```
/modelready status port=<port>
/modelready stop port=<port>
```

### Set default host or port

```
/modelready set_ip ip=<host>
/modelready set_port port=<port>
```

## Notes

* The model is served locally using vLLM.
* The exposed endpoint follows the OpenAI API format.
* The server must be started before sending chat requests.
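Because the endpoint follows the OpenAI API format, you can also call it directly from your own code instead of going through `/modelready chat`. A minimal sketch, assuming a server started on port 8010 and reachable on localhost (both assumptions — adjust to whatever you passed to `start`, `set_ip`, or `set_port`); the `model` field is a placeholder, since vLLM typically expects the served model's name there:

```python
import json
import urllib.request

def chat_request(port, text, host="localhost", model="default"):
    """Build an OpenAI-style chat completion request for a running server.

    `host`, `port`, and `model` are illustrative defaults, not values the
    skill guarantees.
    """
    url = f"http://{host}:{port}/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": text}],
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

# Sending it requires a server that is already running, e.g.:
# with urllib.request.urlopen(chat_request(8010, "hello")) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

This is the same request shape the `chat` command produces on your behalf; building it yourself is mainly useful for scripting against a model you started from chat.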