@kevmyung

Description

Add properties to track actual context size (input tokens) during agent invocations.

Problem

Currently, usage.inputTokens reports only the sum accumulated across all cycles. There is no way to determine the actual current context size, which is needed for proactive context window management.
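To illustrate the problem with hypothetical token counts (three model calls in one invocation), the accumulated sum overstates the live context size:

```python
# Hypothetical inputTokens reported by the model for each cycle
# of a single agent invocation.
cycle_input_tokens = [1_000, 1_400, 1_900]

# What usage.inputTokens exposes today: the accumulated sum.
accumulated = sum(cycle_input_tokens)

# The actual current context size is the last call's input tokens.
actual_context = cycle_input_tokens[-1]

print(accumulated)      # 4300 — looks close to a small context window
print(actual_context)   # 1900 — the real context size
```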

Solution

Add three new properties to AgentInvocation and EventLoopMetrics:

  • initial_input_tokens: Input tokens from the first LLM call
  • final_input_tokens: Input tokens from the last LLM call (actual context size)
  • context_growth_tokens: Growth during invocation (final - initial)
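A minimal sketch of how the three properties could be derived from the existing per-cycle usage data (CycleUsage and context_metrics are illustrative names here, not the SDK's actual types):

```python
from dataclasses import dataclass


@dataclass
class CycleUsage:
    """Stand-in for one cycle's usage record (inputTokens only)."""
    input_tokens: int


def context_metrics(cycles: list[CycleUsage]) -> dict[str, int]:
    """Derive the three proposed properties from cycles[].usage."""
    initial = cycles[0].input_tokens   # first LLM call
    final = cycles[-1].input_tokens    # last LLM call = actual context size
    return {
        "initial_input_tokens": initial,
        "final_input_tokens": final,
        "context_growth_tokens": final - initial,
    }


metrics = context_metrics(
    [CycleUsage(1_000), CycleUsage(1_400), CycleUsage(1_900)]
)
print(metrics)
```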

Benefits

  • No additional API calls (leverages existing cycles[].usage data)
  • No latency impact
  • Model-agnostic
  • Backward compatible
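As a hedged example of the proactive management this enables (the threshold, window size, and function name below are illustrative, not part of this PR):

```python
CONTEXT_WINDOW = 8_000  # hypothetical model context limit


def should_compact(final_input_tokens: int, threshold: float = 0.8) -> bool:
    """Signal that context management (e.g. summarization or message
    trimming) should run before the window is exhausted."""
    return final_input_tokens >= threshold * CONTEXT_WINDOW


print(should_compact(1_900))  # False — plenty of headroom
print(should_compact(7_000))  # True  — past 80% of the window
```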

Related Issues

Closes #1197 ([FEATURE] Track agent.messages token size)

Documentation PR

N/A

Type of Change

  • New feature (non-breaking change which adds functionality)

Testing

  • I ran hatch run prepare

Checklist

  • I have read the CONTRIBUTING document
  • I have added any necessary tests that prove my fix is effective or my feature works
  • I have updated the documentation accordingly
  • I have added an appropriate example to the documentation to outline the feature, or no new docs are needed
  • My changes generate no new warnings
  • Any dependent changes have been merged and published

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.


Development

Successfully merging this pull request may close these issues.

[FEATURE] Track agent.messages token size