OpenAI has built a version of GPT-4, its latest text-generating model, that can “remember” roughly 50 pages of content thanks to a greatly expanded context window.
That might not sound significant. But it’s four times as much information as the vanilla GPT-4 can hold in its “memory” and eight times as much as GPT-3.
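The “roughly 50 pages” figure can be sanity-checked with some back-of-the-envelope arithmetic. The numbers below are assumptions, not from the article: a 32,768-token window, about 0.75 English words per token, and about 500 words per page.

```python
# Rough check of the "roughly 50 pages" claim.
# All three constants are assumptions for illustration only.
TOKENS = 32_768          # assumed size of the expanded context window
WORDS_PER_TOKEN = 0.75   # common rule of thumb for English text
WORDS_PER_PAGE = 500     # typical single-spaced page

words = TOKENS * WORDS_PER_TOKEN   # ≈ 24,576 words
pages = words / WORDS_PER_PAGE     # ≈ 49 pages

print(round(pages))  # → 49
```

Under those assumptions the window works out to just under 50 pages, consistent with the article’s figure.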
“The model is able to flexibly use long documents,” Greg Brockman, OpenAI co-founder and president, said during a live demo this afternoon. “We want to see what kinds of applications [this enables].”
In text-generating AI, the context window refers to the text the model considers before generating additional text. While models like GPT-4 “learn” to write by training on billions of examples of text, they can only consider a small fraction of that text at any one time — a limit determined chiefly by the size of their context window.
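The effect of a fixed-size window can be sketched in a few lines: the model only “sees” the most recent N tokens of its input, and anything earlier is simply cut off. The whitespace tokenization here is a toy stand-in for the subword tokenizers real models use (e.g. byte-pair encoding) and is only illustrative.

```python
# Toy sketch of a fixed context window: only the last `window_size`
# tokens of the input are visible to the model; earlier text is dropped.
# Splitting on whitespace is a simplification of real subword tokenizers.
def visible_context(history: str, window_size: int) -> str:
    tokens = history.split()
    return " ".join(tokens[-window_size:])

conversation = "my name is Ada and later we discussed openings in chess"
# With a small window, the earlier "my name is Ada" part falls outside
# the visible context and is effectively forgotten.
print(visible_context(conversation, 5))  # → "discussed openings in chess" plus one prior token
```

A larger `window_size` is the toy analogue of the expanded window described above: more of the earlier conversation stays visible when generating the next token.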
Models with small context windows tend to “forget” the content of even very recent conversations, leading them to veer off topic.

  • March 14, 2023
  • post author: Kyle Wiggers