fix(opencode): fix 1M context for Opus - compaction token counting (#154) #155

Merged

randomm merged 2 commits into dev from fix/issue-154-opus-1m-context on Feb 6, 2026

Conversation

randomm (Owner) commented on Feb 6, 2026

Summary

Fixes Opus hitting the 200k token limit despite the context-1m header.

Changes

  • Compaction fix: count only input tokens toward the limit; the API enforces the context limit on input tokens, not on output/thinking tokens (see the sketch after this list)
  • Context override: Anthropic Sonnet/Opus models get a 1M-token context limit
  • Preserved: the adaptive-thinking header and effort-based thinking for Opus 4.6
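
A minimal TypeScript sketch of how the two token-accounting changes fit together. All names, the model ID matching, and the 0.9 compaction threshold are hypothetical illustrations, not opencode's actual code:

```typescript
// Hypothetical sketch; interfaces and function names are illustrative only.

interface Usage {
  inputTokens: number;
  outputTokens: number; // includes thinking tokens
}

interface ModelInfo {
  providerID: string;   // e.g. "anthropic" (assumed identifier)
  modelID: string;      // e.g. an Opus/Sonnet model ID
  contextLimit: number; // provider-advertised default, e.g. 200_000
}

// Compaction fix: only input tokens count toward the context window,
// because the API enforces the limit on input tokens, not output/thinking.
function usedContextTokens(usage: Usage): number {
  return usage.inputTokens;
}

// Context override: Anthropic Sonnet/Opus models requested with the
// context-1m header are treated as having a 1M-token context limit.
function effectiveContextLimit(model: ModelInfo): number {
  const isAnthropic = model.providerID === "anthropic";
  const isSonnetOrOpus = /sonnet|opus/i.test(model.modelID);
  return isAnthropic && isSonnetOrOpus ? 1_000_000 : model.contextLimit;
}

// Compaction now triggers on input usage against the effective limit,
// so Opus no longer compacts at ~200k when the 1M window is active.
// The 0.9 threshold is an assumed placeholder.
function shouldCompact(usage: Usage, model: ModelInfo, threshold = 0.9): boolean {
  return usedContextTokens(usage) > effectiveContextLimit(model) * threshold;
}
```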

Also includes

  • New /upstream-gap custom command for analyzing fork divergence

Fixes #154

- Fix compaction to count only input tokens (not output/thinking)
- Context limit override for Anthropic models with context-1m header
- Restore adaptive-thinking header and effort-based thinking for Opus 4.6
randomm merged commit 6954639 into dev on Feb 6, 2026
2 of 3 checks passed
randomm deleted the fix/issue-154-opus-1m-context branch on February 6, 2026 at 08:35
github-actions bot commented on Feb 6, 2026

Thanks for your contribution!

This PR doesn't have a linked issue. All PRs must reference an existing issue.

Please:

  1. Open an issue describing the bug/feature (if one doesn't exist)
  2. Add Fixes #<number> or Closes #<number> to this PR description

See CONTRIBUTING.md for details.
