Yes, I use search engine(s) constantly - namely Kagi, which really does feel like Google used to.
I tried using LLMs on a recent project when I was trying to figure out whether something was possible, and they were actively misleading every time. The thing I was asking about turned out not to be currently possible, but the LLMs would never say so; instead, they made up incorrect ways to solve my problem rather than admit it couldn't be done.
Really, these days, either I already know a resource exists and want to find it, in which case a search engine makes much more sense than an LLM that might hallucinate, or I want to know whether something is possible and how to do it, in which case the LLM will again hallucinate an incorrect way to do it.
I've only found LLMs useful for translation, transcription, natural-language interfaces, and the like.