Context Collapse

When the agent forgets what you told it three files ago.

The simple explanation

Imagine reading a long, detailed requirements document, but every time you turn a page, the first page starts to blur. By page 20, you've forgotten constraints from page 1. That's context collapse.

Context collapse happens when an AI agent's context window fills up or its attention degrades over a long conversation. The agent starts "forgetting" important information: constraints you set earlier, architectural decisions it made, or relationships between files it already read. It might make changes that directly contradict what it did five minutes ago, or ignore a requirement you specified at the beginning of the task.

It's not that the information is gone; it may still technically be in the context window. But models pay less attention to information that sits far from the current focus, especially when the context is very long. This is sometimes called the "lost in the middle" problem.

Why it matters for agentic engineering

Context collapse is one of the most common failure modes in agentic workflows, especially for complex, multi-file tasks. An agent that's been working on a feature for 30 minutes might have consumed enough context that its early instructions have effectively faded.

This is a key reason why agentic engineering favors breaking work into smaller, focused tasks rather than giving an agent a massive, sprawling task. Smaller tasks keep the context window manageable and reduce the chance of collapse.
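
As a rough sketch of this pattern (`run_agent` is a hypothetical stand-in for an agent invocation, not a real API), each subtask can be run against a fresh context seeded only with the shared constraints, instead of inheriting the full transcript of everything that came before:

```python
# Sketch: run each subtask in a fresh context instead of one long session.
# `run_agent` is a hypothetical placeholder, not a real library call.

def run_agent(prompt: str, context: list[str]) -> str:
    # Placeholder: a real implementation would call an LLM agent here.
    return f"done: {prompt} (context size: {len(context)})"

def run_in_fresh_contexts(shared_constraints: list[str],
                          subtasks: list[str]) -> list[str]:
    results = []
    for task in subtasks:
        # Each subtask starts from only the shared constraints,
        # not the accumulated transcript of previous subtasks.
        context = list(shared_constraints)
        results.append(run_agent(task, context))
    return results
```

The design choice here is that nothing carries over between subtasks except what you deliberately pin, which keeps each context small and each instruction close to the model's focus.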

In practice

Symptoms of context collapse:

- Changes that directly contradict edits or decisions the agent made earlier in the same session.
- Requirements you stated at the beginning of the task being silently ignored.
- Forgotten constraints, architectural decisions, or relationships between files the agent already read.

Prevention strategies:

- Break work into smaller, focused tasks so each one starts with a manageable context.
- Restate critical constraints partway through a long task instead of relying on instructions from the beginning.
- Start a fresh session for each distinct piece of work rather than continuing one sprawling conversation.
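
The constraint-restating idea can be sketched in a few lines. This is an illustrative toy, not any tool's real API: the 4-characters-per-token estimate and the budget number are assumptions, and a real agent framework would count tokens properly.

```python
# Illustrative sketch: track an approximate token budget for an agent
# conversation and re-inject pinned constraints before they fade.
# The 4-chars-per-token heuristic and BUDGET value are assumptions.

PINNED = []     # constraints that must survive the whole task
history = []    # running conversation transcript
BUDGET = 8000   # hypothetical context budget, in tokens

def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    return max(1, len(text) // 4)

def pin(constraint: str) -> None:
    """Mark a constraint as must-not-forget."""
    PINNED.append(constraint)

def add_message(msg: str) -> list[str]:
    """Append a message; once the context is getting long, restate
    the pinned constraints so they sit near the model's focus."""
    history.append(msg)
    used = sum(estimate_tokens(m) for m in history)
    if used > BUDGET * 0.8:
        history.append("REMINDER: " + "; ".join(PINNED))
    return history

pin("All new modules must use the existing logging wrapper")
add_message("user: refactor the payment service")
```

The point is not the arithmetic but the habit: decisions you cannot afford to lose get restated late in the conversation instead of living only at the top of a long transcript.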