https://www.trychroma.com/research/context-rot
Context Rot: How Increasing Input Tokens Impacts LLM Performance | Chroma
Large Language Models (LLMs) are typically presumed to process context uniformly—that is, the model should handle the 10,000th token just as reliably as the...
https://www.morphllm.com/context-rot
Context Rot: Why LLMs Degrade as Context Grows (Complete Guide) | Morph
Context rot is the measurable performance degradation LLMs experience as input length increases. Chroma tested 18 frontier models and found every one gets...