Context Window
The maximum amount of text (code, conversation, files) an AI model can process at once.
The Full Picture
The context window is measured in tokens and determines how much information a model can "see" at once. A larger context window means the AI can reason over more of your codebase in a single request. Claude offers a 200K-token context window (roughly 150K words of English), while GPT-4 variants range from 8K to 128K tokens.
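To get a feel for these numbers, here is a minimal sketch that estimates token counts with the common "about 4 characters per token" heuristic and checks whether text fits a given window. The function names and the 4-chars-per-token ratio are illustrative assumptions; real counts require the model's actual tokenizer.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token in English.

    This is a ballpark heuristic only; exact counts require the
    model's own tokenizer (e.g. a BPE tokenizer).
    """
    return max(1, len(text) // 4)


def fits_in_window(text: str, window_tokens: int = 200_000) -> bool:
    """Check whether text plausibly fits a context window of the given size."""
    return estimate_tokens(text) <= window_tokens


prompt = "Refactor this function to be pure. " * 1000
print(estimate_tokens(prompt), fits_in_window(prompt))
```

Swapping `window_tokens` lets you compare the same codebase against an 8K, 128K, or 200K budget.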
When your project exceeds the context window, the AI loses track of earlier code — which is why tools like Cursor use smart retrieval to feed only the relevant files.
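The retrieval idea above can be sketched as a greedy selection loop: rank files by relevance to the query, then add files until the token budget is spent. This is a toy illustration of the budgeting concept, not how Cursor actually works; production tools rank by embedding similarity rather than the word-overlap score used here, and `select_files` is a hypothetical helper name.

```python
def select_files(query: str, files: dict[str, str], budget_tokens: int = 8000) -> list[str]:
    """Greedy context packing: rank files by keyword overlap with the
    query, then include files in rank order until the token budget
    (estimated at ~4 chars/token) is exhausted."""
    query_words = set(query.lower().split())
    # Score each file by how many query words appear in it.
    ranked = sorted(
        files,
        key=lambda name: -len(query_words & set(files[name].lower().split())),
    )
    picked, used = [], 0
    for name in ranked:
        cost = len(files[name]) // 4  # rough token estimate
        if used + cost <= budget_tokens:
            picked.append(name)
            used += cost
    return picked
```

A real system would also chunk large files and re-rank with embeddings, but the budget check at the end is the same: never hand the model more than the window can hold.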
Related Terms
Tokens — the basic units AI models use to process text; roughly 1 token per 0.75 words in English.
RAG (Retrieval-Augmented Generation) — feeding an AI relevant documents so it can answer based on your actual data.
Prompt Engineering — the skill of writing effective instructions for AI models to get the output you actually want.