Anthropic has upped the ante for how much information a large language model (LLM) can consume at once, announcing on Tuesday that its just-released Claude 2.1 has a context window of 200,000 tokens. That’s roughly the equivalent of 150,000 words, or more than 500 printed pages of information, Anthropic said.
The latest Claude version is also more accurate than its predecessor, carries a lower price, and includes tool use in beta, the company said in its announcement.
Source: Computerworld