With AI you’ve got no idea whether something is right; it’s a probabilistic thing.
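To make the "probabilistic" point concrete, here is a toy sketch of how LLM decoding works. Nothing here is tied to any real model; the vocabulary and scores are invented for illustration. Greedy decoding always picks the highest-probability token, but sampled decoding draws from a distribution, so the same prompt can yield different code on different runs:

```python
import math
import random

def softmax(logits):
    """Convert raw scores into a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token scores over a tiny made-up vocabulary.
vocab = ["return", "raise", "yield", "pass"]
logits = [2.0, 1.5, 1.4, 0.1]
probs = softmax(logits)

def sample_token(rng):
    """One decoding step: sample a token from the distribution."""
    return rng.choices(vocab, weights=probs, k=1)[0]

# Greedy decoding is deterministic: it always picks the top token.
greedy = vocab[probs.index(max(probs))]
print(greedy)  # "return"

# Sampled decoding varies with the random state: different seeds
# can yield different tokens for the exact same "prompt".
samples = {sample_token(random.Random(seed)) for seed in range(20)}
print(samples)
```

The point of the sketch is only that the output is drawn from a distribution, not computed: there is no step at which the procedure checks whether the sampled token is *right*.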
That’s why I don’t buy the “AI as just another abstraction layer” argument. AI is something different and we should use it differently.
By the same principle, I also think advancement in AI is different from other technological advancement. Even the invention of the computer is at most on par with AI, if AI goes far enough. People always reach for the train-horse analogy, but I think we will see a gloomy future in the next 5-10 years -- especially when the whole world is turning not to the left, but to the right.
Now that the US and China and everyone else are competing on AI, that future might come earlier than I thought.
Autocomplete and intellisense are tools first and foremost. AI is centralized into a handful of companies that you are forced to pay every month.
Autocomplete and intellisense don't care about your data. There's an inherent issue with data, privacy and ownership when it comes to AI.
If we can run useful models locally and make it generally available on consumer hardware... things would be different.
If AI writes the majority of code, then we will stop seeing the shortcomings of existing tools, libraries, and frameworks, and eventually stop getting new frameworks and libraries.
Would there be Rails if DHH had vibe coded 25 years ago? Or would there be a library like delayed_job if Tobi had vibe coded back when he started Shopify?
I blame the proliferation of MBAs, but regardless of the cause, management killed software engineering as a true craft once they co-opted what agile development means, threw proper testing out the window, and wanted devs to become turnkey feature factories.
I work mostly on small codebases and you can smell unchecked AI codegen from a mile away. Even small changes become giant refactors.
People who push this have never had to work in such codebases. God forbid your requirements change mid project. Rarely does the sum of separate prompts without full context result in something maintainable and modifiable. You’re cooked.
It’s like communicating in a foreign language: you can do it with AI, but if it’s an essential element of your job, then you’re going to invest the time to learn it so it becomes part of you.
> In a world pushing for “reflexive AI usage,” I’m advocating for something different: thoughtful, intentional collaboration with AI that preserves the essence of coding as a craft.
> ...
> Like Rocky, we sometimes need to step away from the comfortable, civilized environment and return to the old gym – the place where real growth happens through struggle, persistence, and focused practice.
> Because coding isn’t just about output. It’s about the journey of becoming better problem solvers, better thinkers, and better engineers. And some journeys can’t be outsourced, even to the most advanced AI.
But here’s the reality: those ideals feel increasingly out of reach. Business demands and short-term thinking rarely leave room for “intentional” or “thoughtful” work. For many of us, having time to grow as engineers is a luxury.
Worse, it’s often personal. I’ve had to carry the weight for friends in crisis, pretending two people were working just to help someone keep their job. It’s brutal—and sadly, not rare.
As AI gets more buzz, many stakeholders now think our work is overvalued. A quick AI PoC becomes “good enough” in their eyes, and we’re expected to polish it into something real—fast, cheap, and under pressure. Meanwhile, we’re constantly defending our craft against the next threat of being replaced by “cheaper” labor.
When I started out, we cared about clean code and craftsmanship. Now, I feel like I should be taking sales courses just to survive.
Today, it’s all about output. Ship faster or get replaced. Quality only matters when it’s too late—after the person who made the bad call has already cashed out.
I know this sounds pessimistic, but for many of us who aren’t in the top 1% of this industry, it’s just reality.
Thanks for the article, Christian. You’re not wrong—but I think you’re one of the few lucky enough to live that perspective. I wish you all the best, and hope you can keep enjoying that rare luxury. There will be a need for true craftsmen—especially when the rest of us have gone numb just trying to keep up.
Of course, that does not have to be true now. You can certainly do this for personal satisfaction.
But the argument in this article is a bit confused. The step that lies beyond "coding" is not of lesser difficulty; on the contrary. Instead of worrying about coding, we can worry about the bigger picture, and all the beautiful thinking, contemplating, and dead ends it entails.
Only now, we are one step closer to solving a real problem.
This is what I’d call ‘programming’. Which you’ll still be doing even if the AI is writing the code.
The question is whether you can become good/better at programming without writing code?
“Who cares, ship it, also we need this new feature next week. What do you mean it will take longer this time? Ridiculous, why didn’t you say something before?”
Likewise, the brain rot, the lost knowledge, and the new tech debt that fewer engineers in the codebase understand will eventually cause issues down the line. The same pressures will ensue, causing stakeholders to ignore all the signs of degradation.
That's the reason I tried to have most of my communication (and complaints) in written and auditable form.
Thing is, this is probably 99% of the programming work of a junior dev at a place where management thinks like that.
Does it? When I trained as a schoolteacher, we were required to engage in 'reflexive practice', meaning at the end of the school day, we were expected to sit down and think about - reflect - on what had happened that day. I don't know how the Shopify CEO meant that phrase, but 'reflexive AI usage' has two conflicting meanings - it can be AI usage that is either actively or passively chosen - and we might need some better phrasing around this.
I left Shopify a couple weeks ago and Tobi is very, very all-in on AI being an integral part of all jobs at Shopify.
Tobi said that how you use AI is now an official part of your review, and that for any new reqs you need to show that the job can't be done by an AI. I left shortly after the memo, so I do not know if things have changed.
Shopify also brought in a very AI CTO a few months ago that internally has been... interesting to say the least.
Also, anecdotally, the quality of code at Shopify was declining rapidly (leadership's words, not mine). All sorts of code reds and yellows were happening to try to wrangle quality. This isn't Blind, so no need for the gore and opinions, but we'll have to see how this shakes out for Shopify.
So the memo seemed to babysit adult engineers. It goes without saying that engineers will use AI as they see fit, and the least a company can do is make Copilot subscriptions available to staff who need them.
That is _reflective_ practice (which involves reflection). Reflexive otoh comes from 'reflex', which does suggest unthinking automaticity.
People for whom development is not their job will absolutely want to get rid of as much of it as possible, because it costs money. I really agree with the author: it does feel like a regression, and it's easy to overlook what makes up most of the job when it looks like it can be fully automated. Once you no longer have people who are used to doing what's quoted, and there are 500 million lines of code and bugs, good luck asking a human to take a look. Maybe AI will be powerful enough to help with debugging, but it's a dangerous endeavor to build a critical business around that. If for any reason (political or otherwise) AI got more expensive, it could kill businesses (Twitter API?).
Today, every type of problem and every type of solution seems to have to be solved with AI, when there are more creative, original, and artisanal ways to solve them (even if, sometimes, they need more time and patience).
Also in your analogy the calculator is the compiler :). AI would be someone telling you the numbers to use and you just trust em.
The question in my mind is if you need to become less productive to keep your thinking skills sharp. Do we need to separate the work from the "gym". We have times when we are using AI heavily to be as productive as possible. Then we have other times where we don't use it all to keep us sharp.
Is this necessary, or are we being old-fashioned? I lean towards it being necessary, but if I had grown up with AI, I might look at not using it the way I look at writing a web app in assembly: yes, I learned it in college, but there's no reason to keep using it.
Coding will be exactly the same soon.
What if every time you had an Aha! moment, you blogged about it in detail. Many people do. AI ingests those blog posts. It uses what they say when writing new code, or assessing existing code. It does use hard-won knowledge; it just wasn't hard-won by AI itself.
The current crop of LLMs has a lot of knowledge, but severely lacks on the "intelligence" part. Sure, it can "guess" how to write a unit test consistent with your codebase (based on examples), but for those 1% where you need to make a deviation from the rule, it's completely clueless how to do it properly. Guessing is not intelligence, although it might appear masked as such.
Don't get me wrong, the "guessing" part is sometimes uncannily good, but it just can't replace real reasoning.
I mean, what can anyone do, anyway? We’ve been on a "quest" toward the total automation of work for decades, and unfortunately these reflections are coming far too late.
Didn’t anyone notice what was happening all these years?
Talking with a musician friend, he pointed out that today, studying, producing, and releasing music is almost volunteer work: the vast majority of artists will likely see no return on their investment, especially with AI flooding the music platforms. I really expect the same to happen to many other jobs.
I wonder if music is the best example, because as far as I recall it has always been like this for musicians. Never in my time, my parents' time, or my grandparents' time have I heard that musician was a career you'd get into for the money.
Not a single book on the NYT bestseller list is written by AI.
At the same time, as AI takes over more and more of the actual coding practice, I find the proliferation of programming languages a waste of resources.
If AI could generate binaries, web assembly directly, or even some "AI specific bytecode" then we could skip the steps in the middle and save a ton of energy.
Generating boilerplate code - getting frustrated about code is what drives new ideas and improvements, I don't want to lose that friction.
Summarizing documentation - Reading and making sense of written material is a skill.
Explaining complex concepts - I don't want explanations on a silver plate, I want to figure things out. Who knows what great ideas I'll run into on the way there.
Helping debug tricky error messages - Again, a skill I like to keep sharp.
Drafting unit tests - No one knows better than I do what needs testing in my code; this sounds like the kind of unit tests no one wants to maintain.
Formatting data - Maybe, or maybe whip out Perl and refresh that skill instead.
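For what it's worth, the "formatting data" item is exactly the kind of task where a few lines of stdlib scripting keep the skill warm. A minimal Python sketch (the names and scores are made up for illustration) that reformats CSV-style lines into JSON:

```python
import csv
import io
import json

# Made-up input: "name,score" lines, as might be pasted from a log.
raw = "alice,90\nbob,72\ncarol,85\n"

# Reformat the CSV rows into a JSON object keyed by name.
rows = csv.reader(io.StringIO(raw))
scores = {name: int(score) for name, score in rows}

print(json.dumps(scores, indent=2))
```

Whether you reach for Python, Perl, or awk matters less than actually doing the transformation yourself once in a while; that is the "refresh that skill" point above.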
Keep delegating everything to AI for a year and I suspect you'll be completely worthless as a developer without it...
"I notice that if you have the door to your office closed, you get more work done today and tomorrow, and you are more productive than most. But 10 years later somehow you don't quite know what problems are worth working on; all the hard work you do is sort of tangential in importance. He who works with the door open gets all kinds of interruptions, but he also occasionally gets clues as to what the world is and what might be important."
Each of those little interruptions is a clue about the wider state of the world (codebase / APIs etc). AI offers a shortcut, but it does not provide the same mental world-building.
But with DE, you need maybe 80%, and the 20% you build with workarounds is constantly under threat. Why? Because you're effectively enclosed in a small space by the design decisions of the DE.
(That said your point is valid — there is boilerplate that is tedious and the resulting pain will be motivation to improve things)
First, the pervasive assumption that there is no skill involved in food preparation is wrong. While the floor may be higher in a kitchen operated by an executive chef, there is a noticeable difference between a badly-made Big Mac and a well-made one. Execution matters.
Next, at this point "IT" is so broad as to be almost meaningless. In this discussion, we're talking about programming.
Finally, you're holding up Michelin starred chefs as being inherently better than all other chefs. The Michelin star program is skewed towards one particular end result; to put it in technology terms, it's like grading your business solely on a narrow set of SLOs rather than a holistic understanding.
AI is liberating them because it automates 80% of their work, and there is nothing wrong with that. Most people work on projects that won't even exist in 10 years; let's stop pretending we're all working on Apollo-tier software. Coding isn't a craft, it's not an art; it's a job in which you spend the vast majority of your time fucking up your eyes and spine to piss code for companies treating you like cattle.
For every """code artisan""" you have a thousand people who'd be as excited about working in a car factory or flipping burgers; it just so happens that tech working conditions are better.
I worked fast food in high school in the 00s, like many folks here, I bet.
The line and product at a fast-food restaurant are heavily optimized. The patties used by a typical US quick-serve restaurant are designed around processes for optimal cook times per patty.
It really doesn't require much skill to assemble a burger at McDonald's these days outside of minimal training most unskilled people can pick up easily.
I was comparing burger flippers to Michelin chefs, not to devs. The vast majority of devs are gluing tools together and working on basic CRUD stuff, which is the burger flipping of the tech world. It's just a job; people don't want to think about code in the shower, on walks, or "cry" about tech problems as the author seems to romanticise. A job is here to provide money so you can live life, not the other way around. If I can automate my burger flipping to go to the gym or read a book instead, I'll gladly do it.