There are people I work with who are deep in the AI ecosystem, and it's obvious what tools they're using. It would not be uncharitable in any way to characterize their work as pure slop: it doesn't work, it's buggy, it's inadequately tested, and so on.
The moment I start to feel behind I'll gladly start adopting agentic AI tools, but as things stand now, I'm not seeing any pressing need.
Comments like these make me feel like I'm being gaslit.
If this stuff was self-evidently as useful as it's being made out to be, there would be no point in constantly trying to pressure, coax and cajole people into it. You don't need to spook people into using things that are useful, they'll do it when it makes sense.
The actual utility of LLMs is dwarfed by the massive investment bubble built on top of them, and it's all riding on projected future gains so hugely inflated that the crash will leave a crater that makes the dotcom bubble look like a pothole.
One dude with an LLM should be able to write a browser fully capable of browsing the modern web or an OS from scratch in a year, right?
Chrome took at least a thousand man-years, i.e. 100 people working for 10 years. I'm lowballing here: it's likely way, way more.
If AI gives a 10x speedup, reproducing Chrome as it exists today would still take 100 man-years: 1 person working for 100 years, 10 people working for 10 years, or 100 people working for 1 year.
Clearly, that's an unrealistic bar to meet.
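The back-of-the-envelope math above can be sketched out explicitly. All numbers here are the comment's own assumptions (1000 man-years for Chrome, a 10x speedup), not measurements, and the calculation assumes perfectly parallelizable work:

```python
# Sketch of the back-of-the-envelope argument: even with a large AI
# speedup, a Chrome-scale project stays far beyond one person's reach.

def effort_after_speedup(man_years: float, speedup: float) -> float:
    """Man-years still required if every hour of work is `speedup`x faster."""
    return man_years / speedup

def calendar_years(remaining_man_years: float, team_size: int) -> float:
    """Calendar years for a team, assuming work parallelizes perfectly."""
    return remaining_man_years / team_size

chrome_effort = 1000  # lowball assumption: 100 people x 10 years
remaining = effort_after_speedup(chrome_effort, speedup=10)

for team in (1, 10, 100):
    print(f"team of {team}: {calendar_years(remaining, team):g} calendar years")
```

In practice the parallelism assumption is generous (Brooks's law), so these are best-case numbers.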
If you want a concrete example: https://github.com/antirez/flux2.c
The creator of Redis started this project 3 weeks ago and used Claude Code to vibe-code it.
It works, it's fast, and the code quality is as high as I've ever seen in a C code base: easily top 1%.
Look at this one-shot, working implementation of a JPEG decoder: https://github.com/antirez/flux2.c/commit/a14b0ff5c3b74c7660...
Now, it takes a skilled person to guide Claude Code to generate this, but I have zero doubt that it was done at least 5x-10x faster than antirez writing the same code by hand.