Better HN
littlestymaar · 1y ago · 0 points
GP is talking about context windows, not the number of tokens used by the tokenizer.
sva_ · 1y ago
Somewhat confusingly, it appears that both the tokenizer vocabulary and the context length are 128k tokens!
littlestymaar (OP) · 1y ago
Yup, that's why I wanted to clarify things.
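To make the distinction in the thread concrete: the tokenizer's vocabulary size (how many distinct token IDs exist) and the model's context window (how many tokens it can attend to in one prompt) are independent settings that just happen to coincide at 128k here. A minimal illustrative sketch, with both numbers hard-coded as assumptions rather than read from any real model config:

```python
# Two independent hyperparameters that happen to share the value 128k
# (assumed values for illustration, not any specific model's real config).
VOCAB_SIZE = 128_000      # distinct token IDs the tokenizer can emit
CONTEXT_WINDOW = 128_000  # maximum tokens the model attends to at once

def fits_in_context(token_ids, context_window=CONTEXT_WINDOW):
    """A prompt fits iff its token *count* is within the context window."""
    return len(token_ids) <= context_window

# A 5-token prompt easily fits; separately, each ID must be a valid
# vocabulary index, i.e. less than VOCAB_SIZE.
prompt = [17, 42, 99_999, 3, 127_999]
assert all(t < VOCAB_SIZE for t in prompt)
assert fits_in_context(prompt)

# A prompt one token longer than the context window does not fit,
# no matter how small each individual token ID is.
assert not fits_in_context([0] * (CONTEXT_WINDOW + 1))
```

So "128k" can describe either axis, which is exactly the ambiguity the comment above is pointing out.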