> AI doesn’t provide directions, it navigates for you.
LLMs (try to) give you what you ask for. If you ask for directions, you'll get something resembling directions; if you ask it to navigate entirely on your behalf, that's what you get.
> and this paper demonstrates that people are likely to take answers for granted.
Could you point out where exactly this is demonstrated in the paper? As far as I can tell from the study, people who used ChatGPT for studying did better than those who didn't, with no difference in knowledge retention.