tpae · 4mo ago · 0 points
I've been building with local AI on Apple Silicon. It's only 8 MB, but runs 30% faster than Ollama.
https://github.com/dinoki-ai/osaurus
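
For anyone who wants to poke at it: below is a minimal Swift sketch of querying a local inference server like this one. It assumes osaurus exposes an OpenAI-compatible /v1/chat/completions endpoint the way Ollama does; the port (1337) and the model name are my guesses, not taken from the repo, so check the project's README for the actual defaults.

    import Foundation

    // A minimal sketch: POST a chat request to a local OpenAI-compatible
    // endpoint. The port (1337) and model name below are assumptions,
    // not confirmed osaurus defaults.
    let url = URL(string: "http://127.0.0.1:1337/v1/chat/completions")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    let body: [String: Any] = [
        "model": "llama-3.2-1b",  // hypothetical model identifier
        "messages": [
            ["role": "user", "content": "Say hello in five words."]
        ]
    ]
    request.httpBody = try! JSONSerialization.data(withJSONObject: body)

    // Block until the async request finishes, so the script doesn't
    // exit before the response arrives.
    let semaphore = DispatchSemaphore(value: 0)
    URLSession.shared.dataTask(with: request) { data, _, error in
        defer { semaphore.signal() }
        if let data, let text = String(data: data, encoding: .utf8) {
            print(text)  // raw JSON completion from the server
        } else if let error {
            print("request failed: \(error)")
        }
    }.resume()
    semaphore.wait()

Run it as a command-line script (swift main.swift) while the server is up; if the project really does speak the OpenAI wire format, any existing OpenAI client should work the same way by pointing its base URL at localhost.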
mattfrommars · 4mo ago
Did you really solo-develop this entire application, including dinoki-ai, which appears to be SaaS?