feat: Support StepFun (step-3.5-flash) via OpenRouter #778
mporracindie wants to merge 1 commit into anomalyco:dev from
Conversation
| open_weights = true | ||
| [cost] | ||
| input = 0.50 |
These might be wrong; the exact numbers are not yet advertised on OpenRouter.
We will probably have the official price next week:
Step 3.5 Flash is still dominating the #1 spot on OpenRouter Trending! Huge thanks to everyone!
Try OpenRouter for free this week!
https://x.com/StepFun_ai/status/2019257979930058903
That PR was probably a little too early 😅
| family = "step" | ||
| release_date = "2026-01-29" | ||
| last_updated = "2026-01-29" | ||
| attachment = true |
The OpenRouter page for this model shows input_modalities: ["text"] only — no image support. This should be:
| attachment = true | |
| attachment = false |
| @@ -0,0 +1,30 @@ | |||
| name = "StepFun: Step 3.5 Flash" | |||
To match the project's naming convention, this should be renamed to:
| name = "StepFun: Step 3.5 Flash" | |
| name = "Step 3.5 Flash" |
| [limit] | ||
| context = 262_144 | ||
| input = 262_144 | ||
| output = 65_536 |
Values confirmed via the OpenRouter API (/api/v1/models): context_length: 256000 and max_completion_tokens: 256000. The input field is not part of the [limit] schema and should be removed.
| [limit] | |
| context = 262_144 | |
| input = 262_144 | |
| output = 65_536 | |
| [limit] | |
| context = 256_000 | |
| output = 256_000 |
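As a quick sanity check, the numbers above can be read straight from OpenRouter's public model list. This is just a sketch: the slug match and field names (`context_length`, `top_provider.max_completion_tokens`, `architecture.input_modalities`) are assumptions based on the current shape of the `/api/v1/models` response.

```python
import json
import urllib.request


def step_flash_limits(models):
    """Extract the fields under review from OpenRouter's model list.

    `models` is the `data` array returned by GET /api/v1/models. Matching
    on the "step-3.5-flash" substring is an assumption; adjust if
    OpenRouter names the slug differently.
    """
    m = next(x for x in models if "step-3.5-flash" in x.get("id", ""))
    return {
        "context": m.get("context_length"),
        "output": (m.get("top_provider") or {}).get("max_completion_tokens"),
        "input_modalities": (m.get("architecture") or {}).get("input_modalities"),
    }


# To run against the live endpoint (no API key required):
#   with urllib.request.urlopen("https://openrouter.ai/api/v1/models") as resp:
#       print(step_flash_limits(json.load(resp)["data"]))
```

If the reported values match (256000 context, 256000 max completion tokens, text-only input), the suggested `[limit]` and `[modalities]` changes follow directly.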
| output = 65_536 | ||
| [modalities] | ||
| input = ["text", "image"] |
| input = ["text", "image"] | |
| input = ["text"] |
| [interleaved] | ||
| field = "reasoning_content" | ||
| [provider] | ||
| npm = "@openrouter/ai-sdk-provider" |
The [interleaved] block should also be removed — most reasoning models on OpenRouter (Claude, DeepSeek, Grok, Trinity, etc.) don't include it. Reasoning works correctly without this parameter.
The [provider] block is unnecessary — npm = "@openrouter/ai-sdk-provider" is already defined in OpenRouter's provider.toml. No other model in this provider includes this block.
| [interleaved] | |
| field = "reasoning_content" | |
| [provider] | |
| npm = "@openrouter/ai-sdk-provider" |
| temperature = true | ||
| knowledge = "2025-01" | ||
| tool_call = true | ||
| structured_output = true |
No evidence on OpenRouter: the model page does not mention structured output support.
| structured_output = true |
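One way to settle this without relying on the page copy is to check the model's `supported_parameters` list from `/api/v1/models`. A small sketch, assuming OpenRouter's current convention of signalling structured output through entries like `structured_outputs` / `response_format` (the exact names are an assumption):

```python
def supports_structured_output(model):
    """True if OpenRouter advertises structured output for this model.

    `model` is one entry from the /api/v1/models `data` array. OpenRouter
    currently exposes a `supported_parameters` list per model; the entry
    names checked here are assumptions based on the current API.
    """
    params = set(model.get("supported_parameters") or [])
    return bool(params & {"structured_outputs", "response_format"})
```

If neither parameter shows up for step-3.5-flash, dropping `structured_output = true` seems right.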
No description provided.