
context

Module: vibex/context

Context management for LLM conversations. Handles token counting, message compression, and context window management.

Interfaces

ContextBudget


Properties:

| Name | Type | Description |
| --- | --- | --- |
| totalLimit | number | Total token limit of the model's context window |
| systemPrompt | number | Tokens consumed by the system prompt |
| tools | number | Tokens consumed by tool definitions |
| completion | number | Tokens reserved for the model's completion |
| overhead | number | Tokens reserved as a safety margin for formatting overhead |
| availableForMessages | number | Tokens remaining for conversation messages |
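
For orientation, here is a sketch of what a populated budget might look like. The values are illustrative only (a real budget comes from `calculateContextBudget`, below), and the interface is assumed to be exported from `vibex/context`:

```ts
import type { ContextBudget } from "vibex/context";

// Illustrative values for a hypothetical 128k-token model;
// the overhead figure is an assumption, not a library constant.
const budget: ContextBudget = {
  totalLimit: 128_000,           // full context window
  systemPrompt: 1_200,           // estimated system-prompt tokens
  tools: 800,                    // estimated tool-definition tokens
  completion: 4_000,             // reserved for the model's reply
  overhead: 500,                 // assumed safety margin (illustrative)
  availableForMessages: 121_500, // totalLimit minus the reservations above
};
```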

Functions

estimateTokenCount


Simple token estimation (4 chars ≈ 1 token). For production, use a proper tokenizer such as tiktoken.

```ts
function estimateTokenCount(text: string): number
```

Parameters:

| Name | Type | Description |
| --- | --- | --- |
| text | string | Text to estimate a token count for |
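
A quick usage sketch; the result should be roughly `text.length / 4`, with exact rounding left to the implementation:

```ts
import { estimateTokenCount } from "vibex/context";

const text = "The quick brown fox jumps over the lazy dog.";
// 44 characters / 4 ≈ 11 tokens (exact rounding is implementation-defined)
console.log(estimateTokenCount(text));
```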

calculateContextBudget


Calculate the token budget for messages.

```ts
function calculateContextBudget(
  totalLimit: number,
  systemPrompt: string,
  toolsText: string = "",
  completionTokens: number = 4000
): ContextBudget
```

Parameters:

| Name | Type | Description |
| --- | --- | --- |
| totalLimit | number | Total token limit of the model's context window |
| systemPrompt | string | System prompt text |
| toolsText | string | Text describing the available tools (default: `""`) |
| completionTokens | number | Tokens to reserve for the completion (default: `4000`) |
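
A usage sketch; the exact overhead the library reserves is internal, so treat `availableForMessages` as approximately the remainder after the listed reservations:

```ts
import { calculateContextBudget } from "vibex/context";

const systemPrompt = "You are a helpful coding assistant.";
const toolsText = "search(query): web search\nread(path): read a file";

// toolsText and completionTokens default to "" and 4000 respectively.
const budget = calculateContextBudget(128_000, systemPrompt, toolsText);

// Roughly: totalLimit - systemPrompt tokens - tools tokens
//          - completionTokens - internal overhead
console.log(budget.availableForMessages);
```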

compressMessages


Compress messages to fit within the context window.

```ts
function compressMessages(messages: ModelMessage[], budget: ContextBudget): Promise<ModelMessage[]>
```

Parameters:

| Name | Type | Description |
| --- | --- | --- |
| messages | ModelMessage[] | Conversation messages to compress |
| budget | ContextBudget | Token budget the messages must fit within |
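
A sketch of the compression flow. `ModelMessage` is assumed here to be the Vercel AI SDK's message type; adjust the import to wherever the type is defined in your setup:

```ts
import { calculateContextBudget, compressMessages } from "vibex/context";
import type { ModelMessage } from "ai"; // assumed source of ModelMessage

const budget = calculateContextBudget(8_000, "You are a concise assistant.");

const history: ModelMessage[] = [
  { role: "user", content: "Summarize our design discussion so far." },
  { role: "assistant", content: "So far we agreed on..." },
  // ...many earlier turns that may overflow an 8k window
];

// Resolves to a message list that fits within budget.availableForMessages;
// the compression strategy itself is internal to vibex.
const fitted = await compressMessages(history, budget);
```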

validateContext


Validate the final context before sending it to the LLM.

```ts
function validateContext(messages: ModelMessage[], budget: ContextBudget): { valid: boolean; reason?: string }
```

Parameters:

| Name | Type | Description |
| --- | --- | --- |
| messages | ModelMessage[] | Final messages to validate |
| budget | ContextBudget | Token budget to validate against |
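
A final safety check before the LLM call, continuing the sketch above (the `ModelMessage` import is again an assumption):

```ts
import {
  calculateContextBudget,
  compressMessages,
  validateContext,
} from "vibex/context";
import type { ModelMessage } from "ai"; // assumed source of ModelMessage

const budget = calculateContextBudget(8_000, "You are a concise assistant.");
const history: ModelMessage[] = [{ role: "user", content: "Hello!" }];

const fitted = await compressMessages(history, budget);
const check = validateContext(fitted, budget);

if (!check.valid) {
  // reason explains the failure, e.g. the messages still exceed the budget
  throw new Error(`Context validation failed: ${check.reason}`);
}
// Safe to send `fitted` to the model.
```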