Better HN
ottah
4d ago
That's actually pretty cool, but I'd hate to freeze a model's weights into silicon without an incredibly specific and broad use case.
patapong
3d ago
Depends on cost IMO - if I could buy a Kimi K2.5 chip for a couple of hundred dollars today I would probably do it.
whatever1
3d ago
I mean if it was small enough to fit in an iPhone why not? Every year you would fabricate the new chip with the best model. They do it already with the camera pipeline chips.
superxpro12
3d ago
Sounds like just the sort of thing FPGAs were made for.
The $$$ would probably make my eyes bleed tho.
chrsw
3d ago
Current FPGAs would have terrible performance. We'd need some new architecture combining ASIC-level LLM performance with support for sparse reconfiguration, maybe.
0x457
3d ago
Wouldn't it be the opposite of freezing weights?