Better HN
findjashua
2y ago
LM Studio is the easiest way to do it
Sabinus
2y ago
That's what I've been playing with. I can load 9 layers of a mixtral descendant into the 12gb vram for GPU and the rest into ~28gb ram for the CPU to work on. It chugs the system sometimes but the models are interestingly capable.
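The split described above (9 layers in 12 GB of VRAM, the rest in ~28 GB of system RAM) is the partial GPU offload that llama.cpp-based tools like LM Studio expose as a "GPU layers" setting. A minimal sketch of the arithmetic behind picking that number, assuming a hypothetical `split_layers` helper and illustrative per-layer sizes (not measured values for any specific model):

```python
# Sketch: estimate how many transformer layers fit in GPU VRAM, with the
# remainder left on the CPU side (llama.cpp-style partial offload).
# layer_size_gb and vram_reserve_gb are illustrative assumptions.

def split_layers(total_layers, layer_size_gb, vram_gb, vram_reserve_gb=2.0):
    """Return (gpu_layers, cpu_layers) from a per-layer size estimate.

    vram_reserve_gb leaves headroom for the KV cache and activations.
    """
    usable = max(vram_gb - vram_reserve_gb, 0.0)
    gpu_layers = min(total_layers, int(usable // layer_size_gb))
    return gpu_layers, total_layers - gpu_layers

# Example: a 32-layer quantized Mixtral-class model at ~1.1 GB/layer
# on a 12 GB card, roughly matching the numbers in the comment above.
gpu, cpu = split_layers(total_layers=32, layer_size_gb=1.1, vram_gb=12)
print(gpu, cpu)  # → 9 23
```

In practice you would pass the resulting count as the `n_gpu_layers`-style option in your runner and nudge it down if the GPU runs out of memory at load time.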