OpenAI has built a version of GPT-4, its latest text-generating model, that can “remember” roughly 50 pages of content thanks to a greatly expanded context window.
That might not sound significant. But it’s four times as much information as the vanilla GPT-4 can hold in its “memory” and eight times as much as GPT-3.
“The model is able to flexibly use long documents,” Greg Brockman, OpenAI co-founder and president, said during a live demo this afternoon. “We want to see what kinds of applications [this enables].”
In text-generating AI, the context window refers to the text the model considers before generating additional text. While models like GPT-4 “learn” to write by training on billions of examples of text, they can only consider a small fraction of that text at a time, an amount determined chiefly by the size of their context window.
Models with small context windows tend to “forget” the content of even very recent conversations, leading them to veer off topic.
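To make the mechanism concrete, here is a minimal sketch of how a fixed context window constrains what a model can consider. It assumes the `tiktoken` tokenizer package; the function name and the truncation strategy are illustrative rather than OpenAI’s actual implementation, and the 32,768-token figure is the commonly reported size of the expanded GPT-4 variant, not a number stated in this article.

```python
# Illustrative sketch (not OpenAI's implementation): a model only "sees" the
# most recent tokens that fit inside its context window, so older turns in a
# conversation fall away first. Assumes the `tiktoken` tokenizer package;
# 32,768 tokens is the commonly reported size of the expanded GPT-4 variant.
import tiktoken

EXPANDED_WINDOW = 32_768  # tokens the expanded GPT-4 variant can consider at once


def fit_to_window(messages: list[str], limit: int = EXPANDED_WINDOW) -> list[str]:
    """Return the most recent messages whose combined token count fits the window."""
    enc = tiktoken.encoding_for_model("gpt-4")
    kept: list[str] = []
    total = 0
    # Walk the conversation newest-first; once the token budget is spent,
    # everything older is dropped, which is why small-window models
    # "forget" early turns.
    for message in reversed(messages):
        tokens = len(enc.encode(message))
        if total + tokens > limit:
            break
        kept.append(message)
        total += tokens
    return list(reversed(kept))
```

Running the same conversation through a much smaller limit shows how quickly earlier turns get cut off, which is the behavior the expanded window is meant to reduce.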

Link to original post https://techcrunch.com/2023/03/14/openai-is-testing-a-version-of-gpt-4-that-can-remember-long-conversations/ from Teknoids News
