Better HN
0 points
kurtoid
3y ago
0 comments
Silly question: how does OpenAI host/serve it?
magixx
3y ago
I think on professional hardware you can get 80 GB of memory per GPU, and they can likely pool memory across GPUs.
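A rough back-of-envelope sketch of why pooling matters, assuming a GPT-3-scale model (175B parameters, which the thread does not confirm) served in fp16 on 80 GB GPUs:

```python
import math

GB = 1024 ** 3

params = 175e9        # assumed model size (GPT-3 scale); not stated in the thread
bytes_per_param = 2   # fp16 weights

weights_gb = params * bytes_per_param / GB   # ~326 GB of weights alone
gpu_mem_gb = 80                              # per-GPU memory the comment mentions (e.g. A100 80GB)

# Weights alone exceed a single GPU, so the model must be sharded
# across several devices (ignoring activations and KV cache, which add more).
gpus_needed = math.ceil(weights_gb / gpu_mem_gb)
print(f"{weights_gb:.0f} GB of weights -> at least {gpus_needed} GPUs")
```

This ignores activation memory and the attention KV cache, so the real footprint per replica would be larger still.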