As someone who is building a “Yelp clone” of sorts, this amounts to roughly one-billionth of the work required to actually build a Yelp clone.
I get that this could be helpful for absolute beginners who just want guidance on the very first steps to building any basic CRUD app.
But there are very prominent people in ML who are actively claiming that programming itself will soon be taken over by ML (Sam Altman, Gwern), and who for some odd reason don’t seem to realize that what GPT-3 is doing here would take any average junior software developer about 30 minutes at most. Meanwhile, no matter how many junior engineers you had, they could never clone Yelp (maybe after many years and a few rewrites, i.e., once they’ve become senior). It would take more like a team of 10 very senior software engineers multiple years to deliver anything close to a Yelp clone. And it’s not just scaling: the first steps are the easiest parts and the best documented.
It’s akin to an ML art program drawing a single wobbly paint stroke and then writing a blog post titled “drawing all the strokes to make a Picasso clone”.
Except it’s even harder than that, because art can be sloppier and generally combines a simple set of skills in forms that aren’t hard to study. In software, one tiny mistake in one area can break the whole system, and debugging it takes deep logical reasoning and tons of iterative problem solving with long-term memory.
Wrong twice: product is half the battle, and it’s not easy.
The bare minimum to compete would be native and web apps with good UX, competitive reviews and search, and better social features. Then you can fight the other half.
I feel a bit lied to with the bait and switch. ^^
FYI - I haven’t run this. I’ve never been strong in the terminal, and I have a tendency to break things there already.
Running GPT-3 written code in the terminal seems like something best left to the experts…
Or maybe I’m just a chicken.