1. In-browser LLM inference engine with WebGPU and OpenAI API (blog.mlc.ai) — 16 points, by CharlieRuan, 1 year ago, 4 comments
2. Gemma locally on iOS, Android, web browsers, and GPUs with a single framework (old.reddit.com) — 5 points, by CharlieRuan, 2 years ago, 1 comment
3. Running LLM (phi-2) locally on latest Google Chrome Android (twitter.com) — 6 points, by CharlieRuan, 2 years ago, 1 comment