Better HN
0 points
charcircuit
1mo ago
0 comments
There are competent open source LLMs out today. They are not highly centralized.
pocksuppet
1mo ago
There's one at the top of Hacker News right now, Qwen3-Coder-Next:
https://news.ycombinator.com/item?id=46872706
int_19h
1mo ago
An 80B MoE model with 3B active params per token is not a competent model regardless of what its cherry-picked benchmarks say. This reminds me of back when every other llama-7b finetune was claiming to be "GPT-4 quality".
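For context on the parameter figures being debated above, here is a rough sketch of how total vs. active parameter counts relate in a sparse mixture-of-experts model. The 80B/3B split comes from the comment; the expert count, expert size, and routing top-k below are made-up illustrative assumptions, not Qwen3-Coder-Next's actual architecture:

```python
# Illustrative MoE parameter accounting. The architecture numbers below are
# assumptions chosen to reproduce the 80B-total / 3B-active shape from the
# thread; they are NOT the real Qwen3-Coder-Next configuration.

def moe_param_counts(num_experts: int, params_per_expert: float,
                     top_k: int, shared_params: float) -> tuple[float, float]:
    """Return (total_params, active_params_per_token), in billions.

    In a sparse MoE, every expert's weights must be stored (total), but a
    router selects only `top_k` experts for each token (active), so
    compute per token scales with the active count, not the total.
    """
    total = shared_params + num_experts * params_per_expert
    active = shared_params + top_k * params_per_expert
    return total, active

# Hypothetical split: 1B shared (attention/embeddings), 79 experts of 1B
# each, top-2 routing -> 80B stored, 3B active per token.
total, active = moe_param_counts(num_experts=79, params_per_expert=1.0,
                                 top_k=2, shared_params=1.0)
print(total, active)  # -> 80.0 3.0
```

This is why the two sides of the thread can both point at the same model: it is "80B" for memory and training cost, but closer to a 3B-class model in per-token compute.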