scottndecker · 3mo ago
Still 256K input tokens. So disappointing (predictable, but disappointing).
coder543 · 3mo ago
https://platform.openai.com/docs/models/gpt-5.2
400k, not 256k.
nathants · 3mo ago
400k - 128k = 272k. Per the Codex CLI source.
coder543 · 3mo ago
If you want to be able to successfully generate up to 128k tokens in one go, then yes, that math checks out.
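
(To make the arithmetic concrete: a minimal sketch of the budget calculation described above, assuming a 400k-token total context window with 128k tokens reserved for output. The constant names are illustrative, not taken from the Codex CLI source.)

    # Context-budget arithmetic from this thread; constants are illustrative.
    TOTAL_CONTEXT_TOKENS = 400_000    # total context window per the model docs
    RESERVED_OUTPUT_TOKENS = 128_000  # room kept for the generated response

    def max_input_tokens(total: int, reserved_output: int) -> int:
        """Input budget left after reserving space for the full output."""
        return total - reserved_output

    print(max_input_tokens(TOTAL_CONTEXT_TOKENS, RESERVED_OUTPUT_TOKENS))  # 272000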
htrp · 3mo ago
It's much harder to train models on longer context inputs.