
Context Window

The maximum amount of text (code, conversation, files) an AI model can process at once.

The Full Picture

The context window is measured in tokens and determines how much information an AI can "see" at once. A larger context window means the model can reason over more of your codebase in a single pass. Claude models offer a 200K-token context window (roughly 150K words of English), while GPT-4 variants range from 8K to 128K tokens depending on the version.
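To build intuition for those numbers, here is a minimal sketch of token estimation. It uses the common heuristic of roughly 4 characters per token for English text; real tokenizers (tiktoken, Anthropic's tokenizer) split text differently, so treat this as a ballpark, not a measurement.

```python
def estimate_tokens(text: str) -> int:
    """Rough token count using the ~4-characters-per-token heuristic.
    Real tokenizers vary by model and by content (code tokenizes differently
    than prose), so this is only an order-of-magnitude estimate."""
    return max(1, len(text) // 4)

# ~800K characters of source text is right around a 200K-token window
doc = "a" * 800_000
print(estimate_tokens(doc))  # 200000
```

A quick check like this tells you whether a file or conversation is anywhere near a model's limit before you paste it in.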

When your project exceeds the context window, the AI loses track of earlier code, which is why tools like Cursor use smart retrieval to feed the model only the most relevant files.
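That retrieval step can be sketched as a greedy budget problem: score each file's relevance to the query, then add the best-scoring files until the token budget is spent. This is a toy illustration using keyword overlap and the 4-chars-per-token heuristic; production tools like Cursor use far more sophisticated signals (embeddings, code indexes), and the function and parameter names here are hypothetical.

```python
def select_files(files: dict[str, str], query_terms: set[str],
                 budget_tokens: int) -> list[str]:
    """Greedy sketch: rank files by query-term overlap, then include them
    in order until the token budget is exhausted."""
    def score(text: str) -> int:
        # naive relevance: how many query terms appear in the file
        return len(set(text.lower().split()) & query_terms)

    def tokens(text: str) -> int:
        return max(1, len(text) // 4)  # rough chars-per-token heuristic

    chosen, used = [], 0
    ranked = sorted(files.items(), key=lambda kv: score(kv[1]), reverse=True)
    for name, text in ranked:
        cost = tokens(text)
        if score(text) > 0 and used + cost <= budget_tokens:
            chosen.append(name)
            used += cost
    return chosen

files = {
    "auth.py": "def login user password token session",
    "readme.md": "project overview and setup docs",
}
print(select_files(files, {"login", "user"}, budget_tokens=50))  # ['auth.py']
```

The key idea is the budget: no matter how relevant the codebase is as a whole, only what fits inside the window ever reaches the model.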


Want to go deeper? I write about the real gaps vibe coding leaves behind.