
Context Window of Language Models

KEEP IN TOUCH | THE GEN AI SERIES

Rahul S
6 min read · Jan 19, 2024

A context window refers to the length of text an AI model can process and respond to at one time: the number of tokens the model can consider when generating a response to a prompt or input.
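To make the token budget concrete, here is a minimal sketch of counting tokens with the tiktoken library. The cl100k_base encoding and the 8,192-token limit are assumptions chosen for illustration, not a statement about any particular model.

```python
import tiktoken

# Assumed encoding and limit for illustration only; use the encoding
# that matches the model you actually call.
ENCODING_NAME = "cl100k_base"
CONTEXT_LIMIT = 8_192

def count_tokens(text: str) -> int:
    """Return the number of tokens the text occupies under the chosen encoding."""
    enc = tiktoken.get_encoding(ENCODING_NAME)
    return len(enc.encode(text))

prompt = "A context window is the span of text a model can attend to at once."
used = count_tokens(prompt)
print(f"{used} tokens used, {CONTEXT_LIMIT - used} tokens left in the window")
```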

A Context Window functions as the model's lens on the world, allowing it to perceive and interpret text.

Imagine you’re reading a book, but you cannot read the whole book at once: you can see only a few sentences at a time through a small window. Every time you slide the window forward, earlier sentences drop out of view and new ones take their place. That is how Context Windows operate in LLMs.

The span of text that an LLM can “see” at any given moment is its Context Window.
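As a rough sketch of the book analogy, the loop below slides a fixed-size window over a token sequence, so each step “sees” only the most recent tokens. The window size and the whitespace “tokenizer” are simplifications for illustration; real models use subword tokenizers.

```python
# A toy illustration of a sliding context window.
# Splitting on whitespace stands in for real subword tokenization
# so the example stays self-contained.
text = "Imagine reading a long book a few sentences at a time through a small window"
tokens = text.split()

WINDOW_SIZE = 6  # assumed window size for the sketch

for step in range(len(tokens) - WINDOW_SIZE + 1):
    visible = tokens[step : step + WINDOW_SIZE]
    print(f"step {step}: model 'sees' -> {' '.join(visible)}")
```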

A Context Window is akin to the system’s ‘working memory’ for a particular analysis or conversation.

The context window for a large language model (LLM) like OpenAI’s GPT refers to the maximum number of tokens the model can take into account at once while reading a prompt and generating a response.
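Because the window acts as working memory, anything that does not fit must be dropped or summarized. The sketch below trims the oldest turns of a conversation until the remaining history fits a budget; the message format and the word-count proxy for tokens are assumptions for illustration.

```python
def trim_history(messages, budget):
    """Drop the oldest messages until the rough 'token' count fits the budget.

    messages: list of {"role": ..., "content": ...} dicts (assumed format).
    budget:   maximum number of whitespace-separated words kept, a crude
              stand-in for a real token count.
    """
    def size(msg):
        return len(msg["content"].split())

    trimmed = list(messages)
    while trimmed and sum(size(m) for m in trimmed) > budget:
        trimmed.pop(0)  # forget the oldest turn first
    return trimmed

history = [
    {"role": "user", "content": "Tell me about context windows."},
    {"role": "assistant", "content": "They bound how much text a model can attend to at once."},
    {"role": "user", "content": "What happens when a conversation gets long?"},
]
print(trim_history(history, budget=20))
```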
