For simple queries, a regular search engine can at best match an LLM-based provider like Perplexity. And if you have to click through and browse several results, forget it: it makes no sense not to use an LLM that provides sources.
I just searched for "What is inherit_errexit?" on Perplexity. It provided eight sources, and none of them was the most authoritative source, which is this page in the Bash manual:
https://www.gnu.org/software/bash/manual/html_node/The-Shopt...
Whereas when I searched for "inherit_errexit" on Google Search, the above page was the sixth result. And when I searched for "inherit_errexit" on DuckDuckGo, it was the third result.
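For anyone who skipped the manual page: here's a minimal sketch of what inherit_errexit actually changes, based on the Bash manual's description (each case runs in its own `bash -c` so the outcomes can be compared side by side; requires bash 4.4+):

```shell
#!/usr/bin/env bash
# Default behavior: a command substitution subshell does NOT inherit `set -e`,
# so the subshell keeps going past `false` and the echo still runs.
a=$(bash -c 'set -e; out=$(false; echo kept-going); echo "$out"')
echo "default: $a"

# With inherit_errexit: the substitution subshell inherits errexit and exits
# at `false`; the failed assignment then stops the inner script under set -e.
b=$(bash -c 'set -e; shopt -s inherit_errexit; out=$(false; echo kept-going); echo "$out"' || echo "aborted")
echo "inherit_errexit: $b"
```

Expected output: `default: kept-going` followed by `inherit_errexit: aborted`.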
I continue to believe that LLMs are favored by people who don't care about developing an understanding of subjects from the most authoritative source material. These are people who don't read science journals, don't read technical specifications, don't read man pages, and don't read a program's source code before installing it. These are people who prioritize convenience above all else.
This makes a lot of sense to me. As a young guy in the '90s I was told that someday "everyone will be fluent in computers," and 25 years later it's just not true. 95% of my peers never developed their fluency, and my kids even less so. The same will hold true for AI: it will be what smartphones were to PCs, a dumbed-down interface for people who want to USE tech, not understand it.
[0]: not that I write blog posts anyway, it's just a fantasy daydream that's been running through my head