28 February 2024

“Strategies for an Accelerating Future” by Ethan Mollick

The original article is by Ethan Mollick, professor at the University of Pennsylvania. It can be found here.

This is only the intro; scroll down for a link to the full article.

"I didn’t expect to have to update my views on the state-of-the-art in AI so soon after writing about Google’s Gemini Advanced, the first real competitor to GPT-4, but there have been two big leaps in Large Language Models this week with real practical implications.

The first has to do with memory: there is a new version of Google’s Gemini (right after the release of the previous one!), which has a context window of over a million tokens. The context window is the information that the AI can have in memory at one time, and most chatbots have been frustratingly limited, holding a couple dozen pages, at most. This is why it is very hard to use ChatGPT to write long programs or documents; it starts to forget the start of the project as its context window fills up.

But now Gemini 1.5 can hold something like 750,000 words in memory, with near-perfect recall. I fed it all my published academic work prior to 2022 — over 1,000 pages of PDFs spread across 20 papers and books — and Gemini was able to summarize the themes in my work and quote accurately from among the papers. There were no major hallucinations, only minor errors where it attributed a correct quote to the wrong PDF file, or mixed up the order of two phrases in a document. You can see how the advent of massive context windows gives AI superhuman recall and new use cases. If I asked a researcher to read through all my papers and summarize major themes, including illustrative quotes, it would take days. The AI did it in less than a minute. And Google has announced that context windows will soon reach 10 million tokens, or nearly 17,000 pages."
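The token-to-word-to-page conversions in the quote above can be sketched as a quick back-of-envelope calculation. The ratios used here (~0.75 English words per token, ~450 words per printed page) are rough rules of thumb, not figures from the article, and real page density varies widely:

```python
# Back-of-envelope conversion for LLM context window sizes.
# Both ratios are rough assumptions, not exact values.
WORDS_PER_TOKEN = 0.75   # typical for English text with common tokenizers
WORDS_PER_PAGE = 450     # assumed density of a printed page

def context_capacity(tokens: int) -> tuple[int, int]:
    """Return (approximate words, approximate pages) for a given context window."""
    words = int(tokens * WORDS_PER_TOKEN)
    pages = round(words / WORDS_PER_PAGE)
    return words, pages

# Gemini 1.5's window: ~750,000 words, on the order of 1,700 pages
print(context_capacity(1_000_000))
# The announced 10M-token window: ~7.5M words, close to 17,000 pages
print(context_capacity(10_000_000))
```

Under these assumptions a 10-million-token window works out to roughly 16,700 pages, consistent with the "nearly 17,000 pages" figure quoted above.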

Click here to view the full original article.
