Skill · JavaScript · v1.0.0

chinese-llm-router

Route your OpenClaw conversations to the best Chinese AI models — no config headaches, just pick and chat.

1,271 downloads
xdd-xund
Updated Feb 16, 2026

Chinese LLM Router

What It Does

Gives your OpenClaw instant access to all major Chinese LLMs through a single unified interface:

  • DeepSeek (V3.2 / R1) — Best open-source reasoning, dirt cheap
  • Qwen (Qwen3-Max / Qwen3-Max-Thinking / Qwen3-Coder-Plus) — Alibaba's flagship, strong all-rounder
  • GLM (GLM-5 / GLM-4.7) — Zhipu AI, top-tier coding & agent tasks
  • Kimi (K2.5 / K2.5-Thinking) — Moonshot AI, great for long context & vision
  • Doubao Seed 2.0 (Pro / Lite / Mini) — ByteDance, fast & cheap
  • MiniMax (M2.5) — Lightweight powerhouse, runs locally too
  • Step (3.5 Flash) — StepFun, blazing fast inference
  • Baichuan (Baichuan4-Turbo) — Strong Chinese language understanding
  • Spark (v4.0 Ultra) — iFlytek, speech & Chinese NLP specialist
  • Hunyuan (Turbo-S) — Tencent, WeChat ecosystem integration

Quick Start

Tell your OpenClaw:

Use DeepSeek V3.2 for this conversation

Or ask it to pick the best model:

Which Chinese model is best for coding? Switch to it.

Commands

| Command | What it does |
| --- | --- |
| `list models` | Show all available Chinese LLMs with status |
| `use <model>` | Switch to a specific model |
| `compare <models>` | Compare capabilities & pricing |
| `recommend <task>` | Get a model recommendation for a task type |
| `test <model>` | Send a test prompt to verify connectivity |
| `status` | Check which models are currently accessible |

Model Selection Guide

| Task | Recommended Model | Why |
| --- | --- | --- |
| General chat | Qwen3-Max | Best all-rounder, strong Chinese |
| Coding | GLM-5 / Kimi K2.5 | Top coding benchmarks |
| Math & reasoning | DeepSeek R1 | Purpose-built for reasoning |
| Long documents | Kimi K2.5 (128K) / DeepSeek V3.2 (1M) | Massive context windows |
| Fast & cheap | Step 3.5 Flash / Doubao Seed 2.0 Mini | Sub-second latency |
| Creative writing | Qwen3-Max / Doubao Seed 2.0 Pro | Rich Chinese expression |
| Agent tasks | GLM-5 / Qwen3-Max | Best tool-use support |
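The table above can be sketched as a simple lookup. The function name `recommend` mirrors the command, but this implementation is illustrative; the model IDs follow the config example below, and the default comes from its `default` field:

```javascript
// Illustrative task → model mapping, following the selection guide.
// The task keys and the lookup itself are assumptions, not the skill's
// actual routing logic.
const RECOMMENDATIONS = {
  chat: "qwen3-max",
  coding: "glm-5",
  reasoning: "deepseek-reasoner",
  "long-context": "kimi-k2.5",
  cheap: "doubao-seed-2.0-mini",
  creative: "qwen3-max",
  agent: "glm-5",
};

function recommend(task) {
  // Fall back to the configured default model for unknown task types.
  return RECOMMENDATIONS[task] ?? "qwen3-max";
}
```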

Configuration

The skill reads API keys from environment or from ~/.chinese-llm-router/config.json:

```json
{
  "providers": {
    "deepseek": {
      "apiKey": "sk-xxx",
      "baseUrl": "https://api.deepseek.com/v1",
      "models": ["deepseek-chat", "deepseek-reasoner"]
    },
    "qwen": {
      "apiKey": "sk-xxx",
      "baseUrl": "https://dashscope.aliyuncs.com/compatible-mode/v1",
      "models": ["qwen3-max", "qwen3-max-thinking", "qwen3-coder-plus"]
    },
    "glm": {
      "apiKey": "xxx.xxx",
      "baseUrl": "https://open.bigmodel.cn/api/paas/v4",
      "models": ["glm-5", "glm-4-plus"]
    },
    "kimi": {
      "apiKey": "sk-xxx",
      "baseUrl": "https://api.moonshot.cn/v1",
      "models": ["kimi-k2.5", "kimi-k2.5-thinking"]
    },
    "doubao": {
      "apiKey": "xxx",
      "baseUrl": "https://ark.cn-beijing.volces.com/api/v3",
      "models": ["doubao-seed-2.0-pro", "doubao-seed-2.0-lite", "doubao-seed-2.0-mini"]
    },
    "minimax": {
      "apiKey": "xxx",
      "baseUrl": "https://api.minimax.chat/v1",
      "models": ["minimax-m2.5"]
    },
    "step": {
      "apiKey": "xxx",
      "baseUrl": "https://api.stepfun.com/v1",
      "models": ["step-3.5-flash"]
    },
    "baichuan": {
      "apiKey": "xxx",
      "baseUrl": "https://api.baichuan-ai.com/v1",
      "models": ["baichuan4-turbo"]
    },
    "spark": {
      "apiKey": "xxx",
      "baseUrl": "https://spark-api-open.xf-yun.com/v1",
      "models": ["spark-v4.0-ultra"]
    },
    "hunyuan": {
      "apiKey": "xxx",
      "baseUrl": "https://api.hunyuan.cloud.tencent.com/v1",
      "models": ["hunyuan-turbo-s"]
    }
  },
  "default": "qwen3-max",
  "fallback": ["deepseek-chat", "doubao-seed-2.0-pro"]
}
```

Setup

  1. Get API keys from the providers you want (most offer free tiers).

  2. Run the setup script:

    node scripts/setup.js
    
  3. Done! Your OpenClaw can now use any configured model.

Pricing Reference (Feb 2026)

| Model | Input (¥/M tokens) | Output (¥/M tokens) | Notes |
| --- | --- | --- | --- |
| DeepSeek V3.2 | ¥0.5 (cache ¥0.1) | ¥2.0 | Cheapest flagship |
| Qwen3-Max | ¥2.0 | ¥6.0 | Free tier available |
| GLM-5 | ¥5.0 | ¥5.0 | Just launched, may change |
| Kimi K2.5 | ¥2.0 | ¥6.0 | Open source, self-host free |
| Doubao Seed 2.0 Pro | ¥0.8 | ¥2.0 | ByteDance subsidy |
| Doubao Seed 2.0 Mini | ¥0.15 | ¥0.3 | Ultra cheap |
| MiniMax M2.5 | ¥1.0 | ¥3.0 | Can run locally |
| Step 3.5 Flash | ¥0.7 | ¥1.4 | Fastest inference |

Prices as of Feb 2026. All providers offer free tiers or credits for new users.
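To turn the table into a rough per-request estimate, multiply token counts (in millions) by the per-million prices; a minimal sketch:

```javascript
// Sketch: estimate request cost in ¥ from per-million-token prices.
// Prices come from the table above (as of Feb 2026); cache discounts
// are ignored for simplicity.
function estimateCost(inputTokens, outputTokens, inputPrice, outputPrice) {
  return (inputTokens / 1e6) * inputPrice + (outputTokens / 1e6) * outputPrice;
}

// DeepSeek V3.2, 100K input + 20K output tokens:
// 0.1 × ¥0.5 + 0.02 × ¥2.0 = ¥0.09
```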

All APIs are OpenAI-Compatible

Every provider listed uses the OpenAI chat/completions format. No special SDKs needed — just change baseUrl and apiKey.
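Because every provider speaks the same format, a single request builder covers all of them; this sketch assumes the standard `/chat/completions` path and bearer-token auth from the OpenAI convention:

```javascript
// Sketch: build an OpenAI-style chat request for any configured provider.
// Only baseUrl, apiKey, and model change between providers.
function buildChatRequest({ baseUrl, apiKey, model, messages }) {
  return {
    url: `${baseUrl}/chat/completions`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Usage (then: fetch(req.url, req.options).then(r => r.json()) ...)
const req = buildChatRequest({
  baseUrl: "https://api.deepseek.com/v1",
  apiKey: "sk-xxx",
  model: "deepseek-chat",
  messages: [{ role: "user", content: "你好" }],
});
```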

Features

  • Auto-fallback: If one provider is down, automatically try the next
  • Cost tracking: See per-model token usage and estimated cost
  • Smart routing: Describe your task, get the best model recommendation
  • Batch compare: Send the same prompt to multiple models, compare outputs
  • Context-aware: Remembers your model preference per conversation topic
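The auto-fallback behavior can be sketched as a loop over the default model plus the `fallback` list from the config example; `callModel` here is a placeholder for the actual provider request, not the skill's real API:

```javascript
// Sketch: try models in order until one succeeds, mirroring the
// "fallback" array in the config example.
async function withFallback(models, callModel) {
  let lastErr;
  for (const model of models) {
    try {
      return await callModel(model);
    } catch (err) {
      lastErr = err; // provider down or over quota; try the next one
    }
  }
  throw lastErr; // every model failed
}

// Usage: withFallback(["qwen3-max", "deepseek-chat", "doubao-seed-2.0-pro"],
//                     (m) => sendChat(m, prompt));
```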
