Environment: Windows 11 with WSL2 (Ubuntu 24.04); model provider: Qwen Cloud (Bailian, 百炼), authenticated with an API key.

1. Listing the available models

```
uadmin@PC26:~$ openclaw models list
OpenClaw 2026.4.11 (769908e) — I'm like tmux: confusing at first, then suddenly you can't live without me.

Model                      Input       Ctx   Local  Auth  Tags
qwen/qwen3.5-plus          text,image  977k  no     yes   default,configured,alias:Qwen
qwen/qwen3.6-plus          text,image  977k  no     yes   configured
qwen/qwen3-max-2026-01-23  text        256k  no     yes   configured
qwen/qwen3-coder-next      text        256k  no     yes   configured
qwen/qwen3-coder-plus      text        977k  no     yes   configured
qwen/MiniMax-M2.5          text        977k  no     yes   configured
qwen/glm-5                 text        198k  no     yes   configured
qwen/glm-4.7               text        198k  no     yes   configured
qwen/kimi-k2.5             text,image  256k  no     yes   configured
```

The model I want to add, qwen/qwen3.5-flash, is not in the list.

2. Picking the model on Bailian

On the Bailian console, select the model. Note its maximum output length (64K) and its context length (1M); both values are needed in the next step.

3. Setting the model parameters

3.1 Method 1: edit the config file

```
vim ~/.openclaw/openclaw.json
```

Add a qwen3.5-flash entry under models.providers.qwen.models. Note that OpenClaw currently accepts only "text" and "image" as input values:

```json
{
  "id": "qwen3.5-flash",
  "name": "qwen3.5-flash",
  "reasoning": false,
  "input": ["text", "image"],
  "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
  "contextWindow": 1000000,
  "maxTokens": 65536
}
```

Here contextWindow (1000000) and maxTokens (65536) come from the 1M context length and 64K maximum output length noted in step 2. You can also add a second entry, tongyi-xiaomi-analysis-pro:

```json
{
  "id": "tongyi-xiaomi-analysis-pro",
  "name": "tongyi-xiaomi-analysis-pro",
  "reasoning": false,
  "input": ["text"],
  "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
  "contextWindow": 32768,
  "maxTokens": 4096
}
```

Select the model you want to use:

```
openclaw models set qwen/qwen3.5-flash
```

or

```
openclaw models set qwen/tongyi-xiaomi-analysis-pro
```

The config change takes effect without a restart, but you can restart the gateway if needed:

```
openclaw gateway restart
```

Check the model status:

```
openclaw models status
```

Then list the models again:

```
uadmin@PC26:~$ openclaw models list
OpenClaw 2026.4.11 (769908e) — If something's on fire, I can't extinguish it—but I can write a beautiful postmortem.
```
```
Model                            Input       Ctx   Local  Auth  Tags
qwen/qwen3.5-flash               text,image  977k  no     yes   default,configured
qwen/qwen3.5-plus                text,image  977k  no     yes   configured,alias:Qwen
qwen/qwen3.6-plus                text,image  977k  no     yes   configured
qwen/qwen3-max-2026-01-23        text        256k  no     yes   configured
qwen/qwen3-coder-next            text        256k  no     yes   configured
qwen/qwen3-coder-plus            text        977k  no     yes   configured
qwen/MiniMax-M2.5                text        977k  no     yes   configured
qwen/glm-5                       text        198k  no     yes   configured
qwen/glm-4.7                     text        198k  no     yes   configured
qwen/kimi-k2.5                   text,image  256k  no     yes   configured
qwen/tongyi-xiaomi-analysis-pro  text        32k   no     yes   configured
```

3.2 Method 2: the console

Open Settings → AI & Agents and configure the model there.

4. Verifying the result

The model selector in the middle of the window lets you switch models, and the model currently in use is shown below each reply. You can also switch by typing:

```
/model qwen/qwen3.5-flash
```
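If you edit ~/.openclaw/openclaw.json by hand often, a small script can catch typos before OpenClaw reads the file. This is only a sketch: the field names and the text/image restriction are taken from the entries shown above, not from any official OpenClaw schema.

```python
# Sanity-check a custom model entry before pasting it into
# models.providers.qwen.models. The schema below is inferred from the
# article's JSON examples, not from OpenClaw documentation.

ALLOWED_INPUTS = {"text", "image"}  # OpenClaw accepts only these input values

def validate_model_entry(entry: dict) -> list[str]:
    """Return a list of problems; an empty list means the entry looks sane."""
    problems = []
    for key in ("id", "name", "input", "contextWindow", "maxTokens"):
        if key not in entry:
            problems.append(f"missing field: {key}")
    bad_inputs = set(entry.get("input", [])) - ALLOWED_INPUTS
    if bad_inputs:
        problems.append(f"unsupported input types: {sorted(bad_inputs)}")
    if entry.get("maxTokens", 0) > entry.get("contextWindow", 0):
        problems.append("maxTokens exceeds contextWindow")
    return problems

# The qwen3.5-flash entry from the article: 1M context, 64K max output.
entry = {
    "id": "qwen3.5-flash",
    "name": "qwen3.5-flash",
    "reasoning": False,
    "input": ["text", "image"],
    "cost": {"input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0},
    "contextWindow": 1000000,
    "maxTokens": 65536,
}
print(validate_model_entry(entry))  # prints: []
```

Running the check on a mistyped entry (say, "inputs" instead of "input", or maxTokens larger than contextWindow) reports the problem immediately, which is cheaper than debugging a silent failure after `openclaw gateway restart`.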