News
Anthropic’s developers recently upgraded the AI model Claude Sonnet 4 to support up to 1 million tokens of context, thereby ...
Anthropic has expanded the capabilities of its Claude Sonnet 4 AI model to handle up to one million tokens of context, five ...
The model’s usage share on AI marketplace OpenRouter hit 20 per cent as of mid-August, behind only Anthropic’s coding model.
Anthropic upgrades Claude Sonnet 4 to a 1M token context window and adds memory, enabling full codebase analysis, long ...
Anthropic’s latest move to expand the context window, now in public beta, might encourage Google Gemini users to give it ...
Anthropic has expanded Claude Sonnet 4’s context window to 1 million tokens, matching OpenAI’s GPT-4.1 and enhancing its ability to process large code bases and document sets in one request.
The company today revealed that Claude Sonnet 4 now supports up to 1 million tokens of context in the Anthropic API — a five-fold increase over the previous limit.
Every on MSN: "Super Sonnet 4" by Every Staff, in Context Window.
Anthropic has upgraded Claude Sonnet 4 with a 1M token context window, competing with OpenAI's GPT-5 and Meta's Llama 4.
What's New: Tier 4 Anthropic API users are getting access to an extended 1M context window.
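For developers on eligible tiers, a long-context request would differ from a standard call mainly in the model name and an opt-in beta flag. A minimal sketch of how such request parameters might be assembled for the Anthropic Python SDK; the model id and beta flag string here are assumptions based on Anthropic's naming conventions, not confirmed values, so check the current API docs before use:

```python
def build_long_context_request(prompt: str) -> dict:
    """Assemble keyword arguments for a hypothetical long-context request.

    The model id and beta flag below are illustrative assumptions, not
    verified identifiers from Anthropic's documentation.
    """
    return {
        "model": "claude-sonnet-4-20250514",   # assumed model id
        "max_tokens": 4096,                    # room reserved for the reply
        "betas": ["context-1m-2025-08-07"],    # assumed long-context beta flag
        "messages": [{"role": "user", "content": prompt}],
    }

# Usage (not executed here): pass the dict to the SDK's beta messages call,
# e.g. client.beta.messages.create(**build_long_context_request(big_prompt))
```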
Anthropic's popular coding model just became a little more enticing for developers with a million-token context window.
Claude Sonnet 4 can now support up to one million tokens of context, marking a fivefold increase from the prior 200,000, ...
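To put the fivefold jump in concrete terms, a back-of-the-envelope check can estimate whether a codebase fits in a given window. This sketch uses the common rough heuristic of about four characters per token (an assumption; real token counts vary by content and tokenizer) and reserves some of the window for the model's reply:

```python
# Rough heuristic: ~4 characters per token for English text and code.
# This is an assumption for illustration; actual tokenization varies.
CHARS_PER_TOKEN = 4

def fits_in_context(total_chars: int,
                    context_tokens: int = 1_000_000,
                    reserved_for_output: int = 50_000) -> bool:
    """Estimate whether text of `total_chars` fits in the context window,
    leaving `reserved_for_output` tokens of headroom for the reply."""
    estimated_tokens = total_chars / CHARS_PER_TOKEN
    return estimated_tokens <= context_tokens - reserved_for_output
```

Under this heuristic, a ~50,000-line codebase at ~60 characters per line (~3M characters, roughly 750K tokens) would fit in the 1M-token window but not in the previous 200K limit.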