Here's the thing about methodologies: anyone can use them. Any methodology that is popular enough for you to have heard about it was developed by one or more smart people and solved some real problem. Between then and now, though, countless cargo-culters have jumped on the bandwagon and misapplied the methodology to the wrong problem in innumerable ways. Maybe they implemented it wrong, maybe they didn't have applicable problems, maybe they are as dumb as a box of rocks but have carved out a comfortable niche in a fat corporation somewhere. The point is, a methodology is meaningless out of context.
Hell, all ideas are meaningless out of context. In everyday human life we tend to share a lot of context with those around us; in software development, i.e. the stuff of pure thought where the only limit is logic itself, we share considerably less. When you're writing a web app you are living in a wholly different world from someone writing a Mars lander ROM, yet we call both of those things "software engineering". If you want to have good ideas and be competent you have to apply those ideas to the context you are in. Being smart is not about having the best ideas, but about evaluating how ideas apply to and interact with systems. The hardest thing about entrepreneurship isn't figuring out how to write software to do X, it's choosing X such that a cascading chain of seemingly random events translates into market traction.
The horrifying thing about software development is the limited intelligence of the human brain to begin with. From a certain perspective we are hopelessly stupid and incapable of truly elegant software design, but on the other hand we are the only entities we know of with the capacity to write software at all! All this is to say you need to embrace ignorance and subjectivity, and simply pledge yourself to continual improvement. You do this not by latching onto ideas and filing them into good or bad buckets, but rather by processing a lot of them, applying them in practice, and seeing how those with more experience than you do the same.
If your curiosity outweighs your frustration over a period of decades, eventually I guarantee you will be at least a competent software developer, and maybe by then ageism in tech will be something the kids laugh at like televisions with twisty knobs.
Having a desk job helps you understand that being able to read your code 3 years down the road matters more than squeezing out the last 5% of productivity right now.
I think I don't understand this readability thing. I have trouble keeping the structure of large programs in my head, so for me the number one readability property of a program is its length. The shorter it is, the quicker I can read it.
For other people it seems to be a lot more about code they don't understand. List and dict comprehensions are one such thing, especially the iterator-based ones. Using variable capture with in-line functions is another, and God forbid I pass around a function pointer to an inner function, even though it often lets me avoid writing an entire pointless class that I then have to remember exists. I use functional programming constructs constantly where they make sense, which tends to be a lot of places. Why is all this forbidden? Is it because people don't know what these constructs do, or how powerful they can be? Rewriting a nested list comprehension into a 20-line function does absolutely nothing to clarify matters for me; it makes things worse.
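For what it's worth, here is a small Python sketch of the constructs in question (the data and names are mine, just for illustration): a nested comprehension doing in one line what a loop-based rewrite would spread over many, and a closure capturing a variable where some shops would insist on a class.

```python
# A nested list comprehension: flatten a matrix, keeping only even values.
matrix = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
evens = [x for row in matrix for x in row if x % 2 == 0]
# evens == [2, 4, 6, 8]

# Variable capture with an in-line function: instead of defining a whole
# class just to hold `factor`, the closure carries it along, and the
# resulting function can be passed around like any other value.
def make_scaler(factor):
    def scale(x):
        return x * factor
    return scale

double = make_scaler(2)
scaled = [double(x) for x in evens]
# scaled == [4, 8, 12, 16]
```

Whether this reads better than the expanded alternative is exactly the disagreement at hand; to some readers the comprehension is the whole program at a glance, to others it is a puzzle.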
This looks to be the crux of your argument and it's a weak one. Just because there are wide ranging opinions on definitions does not mean the ideas have no merit. The idea of "education" means something very different to many people. That doesn't mean that public school, college, or Cisco certification training is "pointless". There are thousands of ideas/words that don't meet 100% agreement. Just the other day, there was a thread on HN about what "mathematical proof" meant. Does that mean someone can legitimately dismiss math proofs as pointless because mathematicians disagree?
Alan Kay's idea of "object-oriented" is different from Bjarne Stroustrup's. Martin Odersky's idea of "functional language" is different from John McCarthy's. Regardless of the differences, there are still good ideas in both OOP and functional paradigms that can improve designs of software architectures.
>I don’t know if I should “love to hear your argument against this” but since this is a blog post from some obscure programmer on the internet, then I’m sure you’ll write it anyways.
I don't understand the conceit you wrote here. If this is how you truly felt, why did you post your blog to HN? I thought the idea was to invite commentary.
Obviously so we could read something he thinks is a discovery.
On the rest of what you've said… It's funny, but while I obviously agree with the whole point of your post, I disagree with almost every single argument you made.
"Pointless" is pretty much about being opposite to "something that works". If it does work it's not pointless. If it doesn't — it very much might be. So everything depends on definition of "working", which is different in different situations, depending on what result we want to get from our activities.
Consequently, all of your "education-based" examples are bad, as somebody might claim Cisco certification, school and/or college are pointless and even be completely right. In the most philosophical sense nothing is really pointless, of course, but if you are coming to me asking for advice you probably don't want the "most philosophical answer", and depending on your goals and personality (even in a very broad sense) I could declare many popular things people pay for "pointless" for you. Because they "don't work", or the costs/profit ratio is too high compared to other options.
Your "math" example is also quite unfortunate (however I must warn you I haven't read the article you are referring to), as if there actually was disagreement on if something is or isn't a proof — it really could and should be "legitimately dismissed". It might not be actually wrong, but you either accept something as proof, or you don't.
Now, it's true that there isn't one good definition of what a proof actually is. Actually, there are doubts that such a definition is even possible. And it's nothing new; this has been a problem for the mathematical community for almost 100 years now.
So how do my last two points go together? Perfectly fine, actually: depending on your views on the problem of defining proof, something can or cannot be a proof, and these views can differ drastically depending on whether you are a member of the Bourbaki group or a layman. For example, Alexander Grothendieck dismissed (quite "legitimately", by the way) the famous proof of the four color theorem, and he was a perfectly fine mathematician, while that proof is something you will probably claim to be a proof.
I see similar problems in your "different languages" example, but it would take a bit longer to explain, so whatever.
Methodologies and even many quite arguable "best practices" aren't pointless because they do work, not because "disagreement doesn't matter". Actually, I would claim that disagreement does matter, because there's a good chance one side will be proven "more right" than the other over time. Yet even the "less right" one is often much better than nothing, especially since the one "absolutely right" opinion doesn't exist anyway.
Then I may have misunderstood the discussion functionality of HN. I've never submitted a story so excuse my ignorance for asking: when you submit an article, is there an option to disable all followup comments so that the post is "read-only"? (Which means the OP forgot to check that option.) If that functionality doesn't exist, I don't understand how anyone would post to HN but not want commentary.
>Consequently, all of your "education-based" examples are bad, as somebody might claim Cisco certification, school and/or college are pointless and even be completely right.
I agree with that, but that's not my point at all. You're talking about a case-by-case situation of delivering education (some of it good, some of it bad) and the subsequent extraction of economic value from time invested (repaying school loans, opportunity cost, money wasted). Maybe someone should drop out of school because he has great card playing skills and wants to bet his income potential on winning a World Series of Poker championship. Or maybe he's an autodidact and can learn iOS and Android on his own to a $100k salary. Scenarios like that are not relevant to my point.
What I was talking about was something else. It was the idea and concept of education. My point is that just because teachers, politicians, and parents disagree on what "education" is does not mean the idea of education is pointless.
>Now, it's true that there isn't one good definition on what proof actually is.
Right. And yet, we as a civilization can still benefit from "math proofs" in spite of the fact that there are some philosophical differences on what "proof" is.
>arguable "best practices" aren't pointless, because they do work, not because "disagreement doesn't matter".
I wasn't stating this. I said something different: that "best practices" are not disqualified because of disagreements. I did not say "best practices" are good because of disagreements. Those are 2 very different sentences and I did not write what you think I wrote.
Instead of trusting everything you read, you should question it and consider it in practice before you actually follow it.
> Instead of trusting everything you read, you should question it and consider it in practice before you actually follow it.
But once you've questioned and evaluated it, it's still a methodology. If you just mean you shouldn't take published methodologies on faith but should consider how they work in your particular environment, that's widely accepted truth. It's, among other things, a central idea of the Agile software movement, as well as central to Lean methods (not just in software, either) and, more generally, to all models built around a Plan-Do-Check-Act cycle and its variants.
It's hardly a controversial or new observation, though there are plenty of failures to put it into effect.
I already got that. It's already in the title of your post. You're just restating the title again.
But in the body of the text, you gave your reasoning and arguments attempting to explain why it was pointless. I'm pointing out that your reasoning (e.g. "different definitions") does not support your title. You need to construct a better set of arguments to convince people of your thesis.
I like to call this the 'fallacy of assumed competence'. We assume teachers are good at teaching, and teachers assume students are good at learning and understanding. As I've grown older I've come to believe that most teachers suck really hard at teaching, and that most students suck really hard at knowing how to ask the teacher for what they need to know.
In terms of programmer methodologies: nobody in the real world cares all that much. Nobody quits their job because their bosses wouldn't enforce using their preferred methodology. If you're all tasked with developing some giant framework or application and you all have to work together, at some point you learn it's a lot less painful to put your ego aside and just get work done.
ಠ_ಠ
Make my words, when you get down to brass stacks it doesn't take rocket appliances to get two birds stoned at once. It's clear who makes the pants in this relationship, and sometimes you just have to swallow your prize and accept the facts. You might have to come to this conclusion through denial and error but I swear on my mother's mating name that when you put the petal to the medal you will pass with flying carpets like it’s a peach of cake.