Better HN
0 points
andai
14d ago
0 comments
I remember reading that hallucination is still a problem even with perfect context. You could build a theoretically perfect RAG pipeline, hand the LLM exactly the correct information, and it will still make mistakes surprisingly often.
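To make the scenario concrete: "perfect RAG" here means the retrieval step is effectively bypassed and the gold passage is placed verbatim in the prompt, so any wrong answer is a generation failure, not a retrieval failure. A minimal sketch of that setup (the passage, question, and `build_prompt` helper are hypothetical, not from any specific library):

```python
def build_prompt(question: str, gold_passage: str) -> str:
    """Assemble a prompt with the exact correct passage already in
    context, simulating a 'perfect' retriever (no retrieval errors)."""
    return (
        "Answer using ONLY the context below.\n\n"
        f"Context:\n{gold_passage}\n\n"
        f"Question: {question}\nAnswer:"
    )

# Hypothetical example: the answer is stated verbatim in the context.
passage = "The Eiffel Tower is 330 metres tall as of 2022."
prompt = build_prompt("How tall is the Eiffel Tower?", passage)
print(prompt)
```

Even with this ideal context, a model can still answer with, say, a memorized pre-2022 figure instead of the one sitting in the prompt, which is exactly the failure mode being described.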
Natfan
13d ago
this was my experience as of about 6 months ago, and i don't believe hallucination is a solved problem yet