
feat: add GLM-4.7-Flash to OpenCode provider #820

Open
ClintEQ wants to merge 1 commit into anomalyco:dev from ClintEQ:add-glm-4.7-flash-opencode

Conversation


ClintEQ commented Feb 6, 2026

Summary

  • Adds glm-4.7-flash to the OpenCode (Zen) provider
  • Same model definition as zhipuai/glm-4.7-flash — free, reasoning-capable, 200K context flash model from Zhipu AI
  • This enables VoltCode users to use GLM-4.7-Flash as a small/auxiliary model via the Voltropy endpoint
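As a rough illustration of the change, the added model entry might look something like the sketch below. The field names (`id`, `reasoning`, `context_window`, `cost`) are assumptions for illustration only, not the provider's actual schema; the values (free, reasoning-capable, 200K context) come from the PR summary.

```python
# Hypothetical sketch of the glm-4.7-flash entry added to the OpenCode (Zen)
# provider. Field names are illustrative assumptions, not the real schema.
GLM_4_7_FLASH = {
    "id": "glm-4.7-flash",
    "name": "GLM-4.7-Flash",
    "reasoning": True,            # reasoning-capable, per the PR summary
    "context_window": 200_000,    # 200K-token context window
    "cost": {"input": 0.0, "output": 0.0},  # free model
}

def is_free(model: dict) -> bool:
    """Return True if the model has zero input and output cost."""
    return all(v == 0.0 for v in model["cost"].values())
```

Because the entry mirrors `zhipuai/glm-4.7-flash`, only the provider it is registered under changes, not the model parameters themselves.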

🤖 Generated with Claude Code

Same model definition as zhipuai/glm-4.7-flash — free, reasoning-capable,
200K context flash model from Zhipu AI.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
