Unsloth creators fix universal error with gradient accumulation | Better HN
(unsloth.ai)
4 points · ZQ-Dev8 · 1y ago · 2 comments
gnabgib · 1y ago
Article title:
Bugs in LLM Training - Gradient Accumulation Fix
ZQ-Dev8 (OP) · 1y ago
Seems like something the PyTorch maintainers would want to know about and fix ASAP...
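For context, the bug the linked Unsloth post describes is a normalization mismatch in gradient accumulation: averaging each micro-batch's mean loss and then averaging those means only matches the full-batch loss when every micro-batch contains the same number of (non-padded) tokens. A minimal sketch of the arithmetic, using plain Python lists of per-token losses as a stand-in for real training code (the function names here are illustrative, not from Unsloth or PyTorch):

```python
# Sketch of the gradient-accumulation normalization issue: a
# "mean of per-micro-batch means" is only equal to the mean over
# all tokens when every micro-batch has the same token count.

def full_batch_mean(losses_per_batch):
    """Mean over all tokens at once -- the reference value."""
    total = sum(sum(batch) for batch in losses_per_batch)
    count = sum(len(batch) for batch in losses_per_batch)
    return total / count

def naive_accumulated_mean(losses_per_batch):
    """Average of per-micro-batch means -- the buggy formulation."""
    n = len(losses_per_batch)
    return sum(sum(b) / len(b) for b in losses_per_batch) / n

def fixed_accumulated_mean(losses_per_batch):
    """Accumulate raw loss sums, divide once by the total token count."""
    total = sum(sum(b) for b in losses_per_batch)
    count = sum(len(b) for b in losses_per_batch)
    return total / count

# Micro-batches with unequal token counts (e.g. varying sequence lengths):
batches = [[1.0, 1.0, 1.0, 1.0], [4.0]]  # 4 tokens vs. 1 token
print(full_batch_mean(batches))         # 1.6
print(naive_accumulated_mean(batches))  # 2.5 -- over-weights the short batch
print(fixed_accumulated_mean(batches))  # 1.6 -- matches the reference
```

The discrepancy grows with variance in sequence length, which is why it matters for LLM fine-tuning on variable-length data.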