The Wise Operator, Scott Krukowski

Context Window

How much an AI can 'remember' in a single conversation. Think of it as the AI's working memory.

A context window is the maximum amount of text an AI model can “hold in its head” during a single conversation, covering both what you send it and what it generates back. It’s measured in tokens (roughly three-quarters of a word each), and current leading models support anywhere from 128,000 to over one million tokens. The context window is the hard boundary on how much information the AI can consider at once, which is why long conversations or large codebases require careful management of what you feed into it.
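The token math above can be sketched in a few lines. This is a minimal illustration, not a real tokenizer: it uses the common rule of thumb that one token is roughly four characters of English text, and the 128,000-token window size is just one example value. Actual tokenizers (BPE and similar) vary by model.

```python
# Rough token estimate using the "1 token ~= 4 characters" rule of thumb.
# Real tokenizers differ by model; this is only an approximation.
def estimate_tokens(text: str) -> int:
    """Estimate how many tokens a string will consume."""
    return max(1, round(len(text) / 4))

def fits_in_window(messages: list[str], window: int = 128_000) -> bool:
    """Check whether a conversation's combined text fits a context window.
    Both your messages and the AI's replies count against the limit."""
    return sum(estimate_tokens(m) for m in messages) <= window
```

Running `fits_in_window` on a conversation before sending it is the basic idea behind the "careful management" mentioned above: if the estimate exceeds the window, something has to be summarized, trimmed, or left out.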

The Simple Version

Imagine you’re having a conversation with someone who can only remember the last 20 minutes of what you said. If you talked about your dog 30 minutes ago, they’ve forgotten. That’s a context window: it’s the limit on how much information an AI can hold in its head at one time.

The bigger the context window, the more the AI can keep track of, like a longer conversation, a bigger document, or more code files at once.

Why It Matters

Context windows are why AI sometimes “forgets” things you told it earlier in a long conversation. It’s not being rude. It literally ran out of room. Understanding this helps you work with AI more effectively: keep conversations focused, remind it of important details, and break big tasks into smaller pieces.
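The "ran out of room" behavior can be made concrete with a simple sliding-window sketch. This is one common strategy, not how any particular model works internally: when the history gets too long, the oldest messages are dropped first, which is exactly why the AI "forgets" what you said early on. The 4-characters-per-token estimate and the window size are assumptions for illustration.

```python
def trim_history(messages: list[str], window: int = 8_000) -> list[str]:
    """Keep the most recent messages whose estimated token total fits
    the window, dropping the oldest first (a simple sliding window)."""
    kept, total = [], 0
    for msg in reversed(messages):          # walk newest -> oldest
        cost = max(1, round(len(msg) / 4))  # rough token estimate
        if total + cost > window:
            break                           # oldest messages fall off here
        kept.append(msg)
        total += cost
    return list(reversed(kept))             # restore chronological order
```

If a detail you mentioned early in the conversation gets trimmed this way, the model simply never sees it again, which is why restating important details in a long session works so well.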

Different AI models have different context window sizes. Bigger isn’t always better (it costs more and can be slower), but it matters when you’re working on complex projects.

How It’s Used on This Site

Building this site with AI tools means managing context windows constantly. When working on a complex feature, the AI needs to “see” multiple files at once. Understanding context limits is the difference between productive AI collaboration and talking to a wall.
