Better HN
s17n
3y ago
Running inference on one of these models takes like a GPU minute, so they can't just let the public use them.
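A back-of-envelope sketch of what "a GPU minute" per request implies, assuming a hypothetical on-demand rate of $2.00 per GPU hour (the thread names no hardware or price, so these numbers are illustrative only):

```python
# Illustrative cost of serving inference when each request ties up a GPU
# for a full minute. The hourly rate is an assumption, not from the thread.
GPU_HOURLY_RATE_USD = 2.00   # assumed on-demand price per GPU hour
SECONDS_PER_INFERENCE = 60   # "a GPU minute" per request, per the comment

def cost_per_request(hourly_rate: float, seconds: float) -> float:
    """Dollar cost of one inference request at the given GPU rate."""
    return hourly_rate * seconds / 3600

per_request = cost_per_request(GPU_HOURLY_RATE_USD, SECONDS_PER_INFERENCE)
print(f"${per_request:.4f} per request")                        # ≈ $0.0333
print(f"${per_request * 1_000_000:,.0f} per million requests")  # ≈ $33,333
```

A few cents per request is small individually but adds up fast at public scale, which is the tension the replies below debate.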
throwawaaaaay17
3y ago
They absolutely can if they charge the public the cost of the GPU time.
yeldarb
3y ago
Can't be this; Google Colab gives out tons of free GPU usage.
astrange
3y ago
Google has a lot of GPUs, but even so Colab seems like it’s a lot cheaper than it should be. You can get some very good GPUs on the paid plan.