From the Lab to Production — AI Engineering Since 2019

LLM

A collection of 2 posts
LLM

Is the LLM Context Window a Vanity Metric?

We've been thinking about this for a while now. Every couple of months some LLM provider puts out a press release — "Now supporting 1 million tokens!" Then 2 million. We're pretty sure we'll see 5 million token context windows by the end of the year.
13 Mar 2026 7 min read
LLM

Specialized, integrated AI chips and how they will change our industry

Let's start with a simple experiment: go to https://chatjimmy.ai/ and ask a random question. Setting the quality of the answers aside, observe how fast the site produces them. It feels as if it spits out the entire paragraph in one go compared
02 Mar 2026 7 min read