Better HN
rafram · 0 points · 7mo ago
Ollama is not a wrapper around llama.cpp anymore, at least for multimodal models (not sure about others). They have their own engine:
https://ollama.com/blog/multimodal-models
1 comment
iphone_elegance · 7mo ago
Looks like the backend is still ggml, or am I missing something? Same diff.