Better HN
janalsncm · 11mo ago
You mentioned it took 100 GPU-hours; what GPU did you train on?
ollin · 11mo ago
Mostly 1xA10 (though I switched to 1xGH200 briefly at the end; Lambda has a sale going). The network used in the post is very tiny, but I had to train for a really long time with a large batch size to get somewhat-stable results.
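As a back-of-envelope check (a rough sketch, not from either commenter: the 100 GPU-hour figure comes from the question, and the mostly-single-GPU setup from the answer above), 100 GPU-hours on one A10 translates directly into wall-clock time:

```python
# Rough wall-clock estimate for the run described above.
gpu_hours = 100   # stated in the question
num_gpus = 1      # "Mostly 1xA10" per the answer; ignores the brief GH200 stint

wall_clock_hours = gpu_hours / num_gpus
wall_clock_days = wall_clock_hours / 24

print(f"{wall_clock_hours} h ≈ {wall_clock_days:.1f} days")  # ≈ 4.2 days
```

So "a really long time" here plausibly means on the order of four days of continuous training on a single card.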