Better HN
fxtentacle
2y ago
That paper says you need to control "0.1% of the training data size" for a 40% chance that a single injected prompt fires. For real-world models, that means millions of images or billions of text tokens.
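A quick back-of-envelope sketch of that 0.1% figure (the corpus sizes below are illustrative assumptions, not numbers from the paper):

```python
def poison_budget(training_size: int, fraction: float = 0.001) -> int:
    """Items (tokens/images) an attacker must control at the given
    poisoning fraction of the training set."""
    return int(training_size * fraction)

# Assumed corpus sizes for illustration only:
# a 2-trillion-token text corpus -> 2 billion poisoned tokens needed
print(poison_budget(2_000_000_000_000))  # 2000000000
# a 5-billion-image dataset -> 5 million poisoned images needed
print(poison_budget(5_000_000_000))      # 5000000
```

Even at 0.1%, the absolute numbers land in the millions-to-billions range, which is the commenter's point.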
talsperre
2y ago
Exactly. Given the sheer volume of internet-scale training data, these poisoning attacks are very difficult to pull off in the wild.
doctorpangloss
2y ago
Yeah, but the vibes man.