What is a context window in AI?
Answered by Hector Herrera

A context window is the amount of text an AI model can process in a single conversation — its working memory. It is measured in tokens, where one token is roughly 0.75 words. GPT-4 has a 128K-token context window (about 96,000 words); Claude offers up to 200K tokens (about 150,000 words). A larger context window lets the model read longer documents, sustain longer conversations, and weigh more information at once. For business use, context window size matters whenever you need AI to process contracts, reports, codebases, or other lengthy documents in a single pass.
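The word-count figures above follow directly from the ~0.75 words-per-token heuristic. A minimal sketch of that arithmetic (the function names and the helper are illustrative, not part of any model's API):

```python
def estimate_tokens(word_count: int, words_per_token: float = 0.75) -> int:
    """Rough token estimate using the ~0.75 words-per-token heuristic."""
    return round(word_count / words_per_token)

def fits_in_context(word_count: int, context_tokens: int) -> bool:
    """Check whether a document of a given word count fits a model's window."""
    return estimate_tokens(word_count) <= context_tokens

# A 90,000-word contract against a 128K-token window:
# 90,000 / 0.75 = 120,000 estimated tokens, which fits.
print(fits_in_context(90_000, 128_000))  # → True
```

Real tokenizers vary by model and language, so treat this as a planning estimate, not an exact count.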