Better HN
layer8
1y ago
Same. And the next step is that it must feed back into training, to form long-term memory and to continually learn.
zoogeny
1y ago
I analogize this with sleep. Perhaps that is what is needed: 6 hours offline per day to LoRA-finetune the base model on the accumulated context from the day.
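The "sleep" update described above would be a LoRA-style low-rank adaptation: the base weights stay frozen, and only two small factor matrices are trained on the day's context. A minimal NumPy sketch, assuming illustrative dimensions and variable names (not anything from the thread):

```python
import numpy as np

# Sketch of a LoRA-style adapted layer (hypothetical setup for illustration).
# The base weight W is frozen; only the low-rank factors A and B would be
# trained during the offline "sleep" phase.
rng = np.random.default_rng(0)

d_out, d_in, rank = 8, 8, 2          # hypothetical layer sizes and LoRA rank
alpha = 4.0                          # LoRA scaling hyperparameter

W = rng.standard_normal((d_out, d_in))        # frozen base weight
A = rng.standard_normal((rank, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, rank))                   # trainable up-projection, zero-init

def forward(x):
    # Effective weight is W + (alpha / rank) * B @ A. With B zero-initialized,
    # the adapted model starts out exactly identical to the base model.
    return (W + (alpha / rank) * B @ A) @ x

x = rng.standard_normal(d_in)
print(np.allclose(forward(x), W @ x))  # True: no drift before any update
```

Because only `A` and `B` (rank × (d_in + d_out) parameters) change, each nightly update is cheap relative to full fine-tuning, and the base model can be restored by simply discarding the factors.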
dev0p
1y ago
LLMs need to sleep too. Do they dream of electric sheep?