Better HN
How to run LLMs locally on mobile devices (with Gemma and On-Device AI tools)
(annjose.com)
1 point
annjose
9mo ago
1 comment
incomingpain
9mo ago
Any model that can run on a mobile device will likely be 8B parameters or smaller, and will have very noticeable hallucination problems.
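A quick back-of-envelope estimate shows why on-device models top out around that size: weight storage alone scales with parameter count and quantization level. This sketch assumes weights dominate memory and ignores activation/KV-cache overhead; the model sizes are illustrative, not from the thread.

```python
# Approximate weight-storage footprint for LLMs at common quantization levels.
# Assumption: memory ≈ params × bits-per-weight / 8; runtime overhead ignored.

def weight_memory_gib(num_params: float, bits_per_weight: int) -> float:
    """Approximate weight storage in GiB."""
    bytes_total = num_params * bits_per_weight / 8
    return bytes_total / (1024 ** 3)

for params, label in [(2e9, "2B"), (8e9, "8B")]:
    for bits in (16, 8, 4):
        print(f"{label} @ {bits}-bit: ~{weight_memory_gib(params, bits):.1f} GiB")
```

At 4-bit quantization an 8B model still needs roughly 3.7 GiB just for weights, which is a large share of a typical phone's RAM; a 2B model at 4-bit fits in under 1 GiB, which is why the smallest Gemma variants are the usual choice for mobile.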