OnDeckLLM — AI Model Lineup Manager
name: ondeckllm
by canonflip-git · published 2026-04-01
$ claw add gh:canonflip-git/canonflip-git-ondeckllm

---
name: ondeckllm
description: >
  Localhost dashboard for managing LLM providers, model routing, and batting-order
  fallback chains. Auto-discovers providers from OpenClaw config or works standalone.
  Use when: (1) user wants to manage or view their LLM providers, (2) user wants to
  change model priority/batting order, (3) user wants to check provider health or
  latency, (4) user wants to add/remove/configure LLM providers, (5) user mentions
  OnDeckLLM or model lineup. Requires: Node.js 22+, OnDeckLLM installed
  (`npm install -g ondeckllm`).
install: npm install -g ondeckllm
---
# OnDeckLLM — AI Model Lineup Manager
## Prerequisites

```shell
npm install -g ondeckllm
```

Verify: `ondeckllm --help` or check the install with `npm list -g ondeckllm`.
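Since OnDeckLLM requires Node.js 22+, an agent can confirm the runtime before installing. A minimal sketch (the version threshold comes from the description above; the messages are illustrative):

```shell
# Check that the installed Node.js meets the 22+ requirement
NODE_MAJOR="$(node -p 'process.versions.node.split(".")[0]')"
if [ "$NODE_MAJOR" -ge 22 ]; then
  echo "node ok"
else
  echo "upgrade Node to 22+"
fi
```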
## What It Does

OnDeckLLM is a localhost web dashboard for managing LLM providers, model routing, and batting-order fallback chains. It auto-discovers providers from OpenClaw config or works standalone.
## Starting the Dashboard

```shell
# Default port 3900
ondeckllm

# Custom port
PORT=3901 ondeckllm
```

The dashboard runs at `http://localhost:3900` (or the custom port).
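When reporting the dashboard address back to a user, the URL can be derived from the same `PORT` variable. A small sketch, assuming the default of 3900 stated above:

```shell
# Derive the dashboard URL from PORT, falling back to the default 3900
PORT="${PORT:-3900}"
echo "http://localhost:${PORT}"
# prints http://localhost:3900 when PORT is unset
```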
## As a Background Service

Use the helper script to check status or start OnDeckLLM:

```shell
node scripts/status.js
```

Output: JSON with `running` (bool), `port`, `url`, and `pid` if active.
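The JSON output can be consumed programmatically with no extra dependencies. A sketch assuming the field names listed above; the sample payload here is made up, not real output:

```shell
# Parse a sample status payload (illustrative values, not real output)
STATUS='{"running":true,"port":3900,"url":"http://localhost:3900","pid":12345}'
node -e '
  const s = JSON.parse(process.argv[1]);
  console.log(s.running ? s.url : "not running");
' "$STATUS"
# prints http://localhost:3900
```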
## Agent Workflow

1. Check if OnDeckLLM is running:

   ```shell
   node scripts/status.js
   ```

2. Open the dashboard for the user: direct them to `http://localhost:3900` (or the configured port/URL).

3. Provider management: OnDeckLLM reads provider config from `~/.openclaw/openclaw.json` automatically, and changes made in the dashboard sync back to the OpenClaw config. No separate API or CLI commands are needed; it's a web UI tool.
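Because the dashboard reads `~/.openclaw/openclaw.json`, an agent could also inspect that file directly. A sketch using a hypothetical schema (a `providers` array with `name` and `priority` fields); the real file layout may differ:

```shell
# Hypothetical openclaw.json shape -- the real schema may differ.
# List provider names in priority (batting) order.
CONFIG='{"providers":[{"name":"ollama","priority":1},{"name":"openai","priority":2}]}'
node -e '
  const c = JSON.parse(process.argv[1]);
  const names = c.providers
    .sort((a, b) => a.priority - b.priority)
    .map(p => p.name);
  console.log(names.join(","));
' "$CONFIG"
# prints ollama,openai
```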
## Configuration

OnDeckLLM stores its data in `~/.ondeckllm/`.
## Remote Ollama

To connect to a remote Ollama instance, configure it in the dashboard UI:

Settings → Ollama → Remote URL (e.g., `http://192.168.55.80:11434`)
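Before entering a remote URL in the UI, it can be sanity-checked with Node's built-in URL parser. A sketch using the example address above; 11434 is Ollama's default port:

```shell
# Validate a remote Ollama URL and normalize host:port
# (11434 is Ollama's default port)
node -e '
  const u = new URL(process.argv[1]);
  console.log(u.hostname + ":" + (u.port || "11434"));
' "http://192.168.55.80:11434"
# prints 192.168.55.80:11434
```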