I believe the anti-college/anti-credentialing stance is primarily a reaction to how people are sold on degrees. People are told they will get better jobs and make more money if they get one. However, no one really takes the time to tell them that just getting a degree does not qualify you to immediately go get a job doing exactly what you want. Essentially, people have been oversold on what the degree actually gives them.
My belief is that the vast majority of technology-related jobs are the modern-day equivalent of 'blue collar' work, more akin to tradesmen, such as electricians. If you read up on the requirements for an electrician in the U.S. (a brief read of http://en.wikipedia.org/wiki/Electrician#United_States is good enough), it follows the same general trend we see with software development experience, even if the lines aren't as clearly drawn.
One caveat: I don't have a degree. I'm pretty close, if I ever wanted to go back and finish it, but I didn't drop out by choice. I'm of the firm belief that it is possible, though significantly more difficult, to gain the same knowledge outside of a college or university; it's just harder to quantify the knowledge you have. I would also suggest everyone go to college and get an undergraduate degree in something that interests them, focusing on core classes and general requirements rather than their particular interest. Universities are great for two things: imparting a general base of knowledge and imparting specialized knowledge. However, undergraduate degrees have shifted to focus on specialized knowledge at the expense of general knowledge.
One last note in my long-winded comment: I really appreciate how you state 'you should go to grad school if you want to get better at CS' rather than the usual defense of schools saying that it's absolutely needed.
The degree certificate itself, perhaps. But I think people also miss what can come along while getting the degree. Networking with senior faculty members. Participating in undergraduate research. Independent classes with those faculty members. Networking with peers of similar interests.
It's what the student makes it. As such, you're quite right that just "getting a degree" doesn't make you qualified for anything aside from being a degree holder. I think it's important to keep this in mind when discussing undergraduate degrees.
That's certainly how most tech workers are treated. However, should they be? Even for non-degree holders (or those like me who transitioned from other fields), it's highly creative, non-mechanical work. The notion of treating tech workers as blue-collar ones stems, I think, from the black-box, sophisticated nature of the work (and perhaps the introversion of many in the business), which the industry has shoehorned into previously known moulds.
When you see someone who has three degrees in various flavors of computer science (BS, MS, PhD) who doesn't know what happens when you use "==" with two floats, it builds.
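For anyone who hasn't been bitten by this, a minimal illustration in Python (the same applies in any language with IEEE 754 floats):

    a = 0.1 + 0.2                # neither 0.1 nor 0.2 is exactly representable in binary
    print(a)                     # 0.30000000000000004
    print(a == 0.3)              # False: == compares the exact bit patterns
    print(abs(a - 0.3) < 1e-9)   # True: the usual fix is a tolerance comparison
    import math
    print(math.isclose(a, 0.3))  # True: relative-tolerance helper in the stdlib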
When your boss tells you that he won't pay you any more because you don't have a degree, it builds. When you find out he's paying the three network engineers 3x what you get and they don't even really understand TCP/IP (so they wind up coming to you for everything), it builds.
When you see someone who is hired as a sysadmin because they are clearly such a great computer scientist and can do anything, and then who can't manage the simplest of Unix maintenance requests, it builds.
Any time someone tries to use their degree as a club instead of a wall covering, it builds.
That's just me. I can't say why other people feel that way.
For the record, I only got my degree two years ago this month. I mostly did it because I really needed to "walk the stage" at last. Everything else was secondary. Now that I have it, does it make me any better than others? No way. If anything, it puts me behind the 8-ball having to pay off these stupid loans for the rest of my life.
It's just a convenient excuse. If it wasn't a degree, it'd be something else. The boss was negotiating, so negotiation skills are what's needed in this situation, rather than a degree.
Your comment, while true, is fairly irrelevant in the context of this article.
Not that degrees matter anymore. They do not. Experience does.
I think a big part of the problem is the irrationality of getting a degree in 'Computer Science' to be a programmer (aka 'Software Engineer'). My degree is in Electrical Engineering with a Computer Engineering focus and I would never have gone for an Electrical Physics degree. I wanted to build things, not be a research scientist.
So, why is the de facto degree for programmers Computer Science? How much do you actually learn about building software in a CS program? My experience indicates: not much. You learn a lot of computational science, great. But most products have one or two difficult algorithms at their core, and those often represent an amazingly small percentage of overall development time. Where is the training for everything else that needs to be done?
The biggest problem is that there seem to be no good Software Engineering programs. Software Engineering degrees do exist, but they seem to be woefully outdated and only address a small subset of those people who wish to 'build things' with software.
Of course, this doesn't apply to everyone. A huge number of people skip computer science degrees and are still highly successful.
But one swallow does not a summer make, which seems to be the approach of many of these articles: Hey, look at me, I didn't go to college and I'm successful!
This seems to be a point that is very difficult for either side to argue, because I don't see a way you could really do a statistical analysis of the careers of CS grads versus non-grads and draw some kind of conclusion.
Speaking as an anti-college, anti-credentialist person, well, a lot of it, for me? is that I didn't go to college.
I mean, everyone likes to pretend to be altruistic, but in the end? we all see the world from our own perspective.
And really? most people seem to think that you go to school, you get a CS degree, and you will be pretty good. And that's simply not true. I've hired people with CS degrees only to have to fire them, because it turns out they didn't actually learn anything.
I mean, I've also worked with people who really did learn incredible things in school. Things, as you said, that I certainly have been unable to teach myself.
I also think a lot of this anti-college stuff is from people who went into debt for life, having been told they'd get useful job training and, you know, "find yourself" (you know, the class thing. No matter how much money I earn? I'll never be middle class; in the eyes of most people, I'll never be a 'professional,' a 'real person,' unless I go get a degree. I'm the guy who fixes the pipes. Which, eh, I am mostly okay with at this point.) - but anyhow, yeah, these kids graduate and find that the market value of their degree is, well, pretty close to zero. (And yes, a lot of these kids got degrees in philosophy or fine arts... and yes, we laugh at them for expecting to turn those degrees into money. But they were 17, goddammit, when they made that decision. What kind of asshole expects a 17-year-old to make good decisions, especially when they receive bad advice from every trusted figure they talk with? Then we go and say, "Unlike every other bad loan you will take in your life, the person giving you the loan takes no responsibility. You have to pay this one off. No bankruptcy.")
This idea that "you can do anything you set your mind to" also permeates education, and it's incredibly destructive; it's how we end up with those people with CS degrees who can't outcode me.
So if you don't go to college to learn to code, why go at all? You go to learn the fundamentals of computer science and math which underlie everything that we do.
It's certainly possible to obtain this knowledge on your own with serious self-study. But it's a hard route (albeit one getting easier thanks to Coursera et al.) and given two candidates with equal coding ability, the one who went to college is far more likely to have a strong grip on theory.
Is theory actually useful? I would claim that it is, even for programmers doing CRUD apps and the like. Knowing the fundamental abstractions of computer science and the ways people have applied them in the past saves an enormous amount of needless reinvention. As a concrete example, parsing is a very well-studied problem in academia. But somebody without that background trying to write a parser will struggle far more than someone familiar with CFGs or PEGs.
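To make that concrete, here's a minimal sketch of the kind of recursive-descent parser that falls out almost mechanically once you can read a grammar (Python; the grammar and names are my own illustration):

    import re

    # Grammar, roughly PEG-style:
    #   expr   <- term   (('+' / '-') term)*
    #   term   <- factor (('*' / '/') factor)*
    #   factor <- NUMBER / '(' expr ')'

    def tokenize(src):
        return re.findall(r'\d+|[-+*/()]', src)

    def parse(tokens):
        pos = 0

        def peek():
            return tokens[pos] if pos < len(tokens) else None

        def eat(expected=None):
            nonlocal pos
            tok = peek()
            if tok is None or (expected is not None and tok != expected):
                raise SyntaxError('expected %r, got %r' % (expected, tok))
            pos += 1
            return tok

        def factor():
            if peek() == '(':
                eat('(')
                value = expr()
                eat(')')
                return value
            if peek() is None or not peek().isdigit():
                raise SyntaxError('expected number or "(", got %r' % peek())
            return int(eat())

        def term():
            value = factor()
            while peek() in ('*', '/'):
                op = eat()
                rhs = factor()
                value = value * rhs if op == '*' else value // rhs  # integer division
            return value

        def expr():
            value = term()
            while peek() in ('+', '-'):
                op = eat()
                rhs = term()
                value = value + rhs if op == '+' else value - rhs
            return value

        result = expr()
        if peek() is not None:
            raise SyntaxError('trailing input: %r' % peek())
        return result

    print(parse(tokenize('2 + 3 * (4 - 1)')))  # 11

The point isn't the code itself; it's that the grammar-to-code mapping is routine once you've seen it, whereas inventing it from scratch usually produces a fragile mess of string-splitting.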
Being able to program is just scratching the surface of computer science. It's necessary but not sufficient for being a great programmer.
As recently as ten years ago, it was impossible to "learn to do" outside of a formal training program, but today it's completely possible to learn to do almost anything via YouTube, iTunes, and the rest of the internet. And the problem is that today's university system (except in fairly rare cases) actually hinders learning to do. The world outside moves too fast for the educational bureaucracy to keep up. Formal education is valuable, but maybe more for learning to think than for learning to do, at least in modern times.
I believe he is advocating a dual system of education, a combination of college and apprenticeship, since there are different types of learners out there. College is the 'only way out' for many people; otherwise you're considered a failure and can't find a job.
If the "dual" system were to include architecture, algorithms, discrete math, and deeper subjects that a typical CS curriculum usually covers, what's the point of an apprenticeship system if it is replicating academia?
I can see a need for a trades-like class of training, but here in the US we already have an assortment of community colleges, certificate programs, and the like that replicate that model.
And saying that personal experiences in college were important to oneself, so everybody should try it, is the same as saying that my personal experiences playing competitive soccer were important to me, so everybody should try it.
And to say that the learner does not get to decide what they will research for their dissertation is just flat-out wrong. It is assumed that the student will pick what to write their dissertation on; they need to study it in depth for 2-4 years, and picking something they do not find interesting will most likely lead to dissatisfaction (but it is still a choice).
Granted, a PhD program requires a prerequisite BS degree or equivalent, which is generally not paid for and follows a typical regime. However, not all knowledge is the same: highly specialized, "brink of human understanding" type learning is paid, unstructured, and definitely has no exercises to complete (since the one learning it has the most understanding of the topic).
More importantly to your question, teaching a hard curriculum to a lot of students at the same time is a hard problem, and exercises with deadlines and a relatively tight syllabus are a method that has been shown, over many years, to work for a large number of people. It certainly doesn't work for everyone, and it certainly isn't the best way to learn, but it's an effective tool for the problem at hand.
Are you saying that training on the job with a good mentor, with classes once a week, does not qualify as "education"?
There are plenty of engineers who love to pore over bleeding-edge academic research while having no problems to solve with it, yet when you ask them to hack together a prototype of an idea, they don't know where to start. I think there is a clear personality type that prefers to continue reading before doing, since there is an ever-present fear that just around the corner lies some kernel of knowledge that will remove the need for a massive body of work. In reality, this is almost never the case, and elbow grease and experimentation turn out to be the solution to many problems, not digging through libraries, frameworks, algorithms, and academic research.
Get a 4-year college degree, do Qt for a living.
There is a large category of jobs that don't require complex math or a good understanding of algorithms, yet we treat them like they require 4-year degrees. This is bad, because it wastes a lot of time and causes a lot of pain.
And some of the more advanced concepts that you mentioned are not beneficial to know for the majority of things you will do in day-to-day work, although I bet you can still learn a lot of them for free with online e-books.
If you want to do research on some really advanced topics, then a degree is a must, but an average (high-)school education is a complete waste of time.
It's fine to say that you're getting by just fine with stuff you learned on the internet, but to call education a waste is to declare that what we have now to be sufficient and there's no point in learning anything new. That might be working for the Amish, but it doesn't work for me.
Even then it is not actually that hard to learn about principal components or matrix regularization if you have some mathematical background and the desire to teach yourself...
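For instance, once you know that PCA is just an SVD of the centered data, the whole thing is a few lines of numpy. A sketch, with toy data of my own invention:

    import numpy as np

    # Toy data: 100 points stretched along one axis much more than the other.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

    Xc = X - X.mean(axis=0)                   # center the data
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

    components = Vt                           # rows are the principal directions
    explained_variance = S**2 / (len(X) - 1)  # variance captured by each direction

    scores = Xc @ components[0]               # project onto the first component
    print(explained_variance)                 # first value dwarfs the second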
Self-education may appeal to many of us here, but it applies to fewer than it appeals to.
That's missing the major point of the piece, IMHO. The OP isn't describing "self-education", and (coming from a US perspective) is not even a typical dropout.
What's being described here is a vocational pattern of learning, where on-the-job experience is backed up with classroom instruction. That's not exactly what "dropout" or "self-educated" imply in the US.
I think a key ingredient is that you have to love to learn, that you would do it regardless of circumstance and that learning itself is your passion.
If that's the case, you can safely drop out of education, but if the motivations are to some extent external, then it's better to stay put.
I agree with you that I don't know that we have the data to argue that apprenticeships are better, but I don't think we have data to argue that they're worse either, and I can't see how having them as an option is bad at all.
I think the whole point is that, statistically speaking, many many more people go through the system and succeed than otherwise. That's why it's the system.
(I'm not saying that's still true.)
There is value in both, and the confirmation bias is apparent but it may not have been the intention of the author.
And what of the people who graduate, get a job, and blog about it?
I doubt Tobi ever had reading comprehension problems and I bet he was passionate about his hobby leading up to his track switch.
As for me, I'm a high school drop out and everybody told me "You'll end up stacking shelves at 7-11!" because that's what dropouts do. They made the same mistake you are accusing us of making, when we talk about our own pasts. I knew several outstanding HS and college dropouts among my online friends at the time and they inspired me to stop wasting my life. I'm so glad they did.
I am 15, and I am planning to drop out of school.
It is extremely frustrating trying to persuade my parents that dropping out of school doesn't mean I am giving up on life, and that I am not a failure.
I cannot function in this artificial society that exists in schools, and I learn much better by myself. Finishing school would be a complete waste of my time, all for a stupid piece of paper. In this day and age, it is extremely easy to attain knowledge for free without attending any institution. Currently, I have several dozen textbooks on my Kindle, and I am taking 3 classes on Coursera.
My parents are trying to compare me to all the stupid kids that usually drop out, and I cannot comprehend their close-mindedness and ignorance.
Because of their stupidity and effort to "help" me, I may even end up homeless by the end of this year.
The best way to get better at being a career programmer is to be a career programmer. This is understandable! And the author's initiative at 16 years of age to identify his passions and stick with them at the exclusion of higher education is commendable.
But I've learned a whole, whole lot in college -- and yes, I completely agree with the author that the amount of time I spent tinkering around with computers is smaller than it otherwise would have been. Instead, I spent time joining a fraternity; taking classes comparing Milton to Bradbury; learning how to pitch a stock; volunteering with the homeless; getting blackout drunk on a Tuesday night; learning the differences between Brahman and Brahma.
Do those things make me a better programmer? No, but I absolutely think they make me a better person than had I spent the past four years as an apprentice. (And, I'd wager that they're better for my career in the long term, but that's not really relevant to this discussion.)
So, I guess, my takeaway: if you know with absolute certainty that computer programming is your sole passion in life, then college is probably not your best choice. (I'm not saying that the author is being anti-college in this post: but I fear most of the readers might interpret his post as such.)
I wholeheartedly agree on all the other things you list about college; doing an apprenticeship doesn't give you any of that. I'm confident that going to college was one of the best decisions of my life, maybe the best decision, given all the contacts and friends I made, the things I experienced, the new topics I learned about, the discussions I had, the parties I had, etc.
All that matters in the end is that you like what you do. And if you don't, you can always quit. You might not always know what you want, but you'll figure it out.
I'm all for enrichment, and I did enrich myself in college, but no more than I do on my own now that I'm in the real world.
(Edit: Also, I didn't learn how to really party until after college =)
How so?
- You could probably pay less. Programmers' opportunities for meaningful, understood-by-the-recipient service are few and far between. I don't think it gets any more meaningful than this. One of my favorite gigs ever was tutoring Python for Tutorspree, and watching my students progress.
- It has a higher meaningful skill ceiling. The problem with "architects" is that every programmer is an architect, or should be. The title creates the idea of "one smart architect, many stupid implementors." But the 10x phenomenon means this is the last thing you want: you want a few smart developers, not a lot of dumb ones. So why waste your talent managing labor, when the labor is useless and what you need is more talent?
His critic, the famous sociologist W. E. B. Du Bois, had the following reaction after a trip to Germany in the 1930s:
  The Siemens AG factory, in Berlin-Siemensstadt, particularly excited him, with a training-and-apprenticeship system that he believed could provide the model for American Negro industrial education.[1]
(Yes, that's the same Siemens that Tobias refers to in his post.)

More recently, scholars like Katherine Newman and others have looked at such programs as a model policy, not for racial uplift but to boost the US's shrinking middle class.[2][3]
With college debt now at $1 trillion and rising, it definitely is not a bad idea to explore alternatives to the way education works in the US.
[1]: http://chronicle.com/article/WEB-Du-Bois-in-Nazi/1896/
[2]: http://books.google.com/books?id=1jfAhghdH7MC
[3]: http://dx.doi.org/10.1007/978-94-007-2272-9_10

In short, a traditional Computer Science curriculum is not designed or intended to churn out, e.g., Rails developers who can be immediately productive. An apprentice-style training system could do that, though at the expense of not teaching some of the more theoretical computer science (or teaching it in a much more applied style).
Commit big changes to open source projects. I bet you get a wide variety of reactions.
However, oftentimes these jobs mostly cater to blue-collar workers. If, instead, you want an apprenticeship as a programmer or something else more white-collar, then, in a small village, chances are dim too. In my case I had to move to a different city so I could find one. Which is something many of my friends did in order to find a suitable apprenticeship: move away.
And by the way, I do have a degree.
In order for something like this to extend to programming, you'd need someone to establish such a guild structure and trade training capacity.
Whether such a thing ever occurs is doubtful in my mind, due to the established interests of those who are already credentialed and in the workplace. Much as becoming a lawyer was once a matter of simply passing the bar (no Juris Doctor was required) but now involves barriers erected by the ABA to protect its membership, I suspect that there would be resistance from established tech workers and managers. (As an amusing aside, the state of Washington actually still has an apprenticeship program for lawyers, but due to the ABA those apprentices can only practice in the state of Washington and will not receive reciprocity with regard to taking and passing the bar in any other state.)
Half my life I've been fascinated with computers; for a quarter of it, I've been actively building programs to populate them, replacing the typical childhood experience with code and logic.
When I got my first job in programming, I had been a hobbyist for a few years and was a fresh mind to mold. But I felt used; I had a house and food to pay for, and it was a hard experience.
That was more than a year ago now, and I'm at another company, one which values my skills better: more money, better prospects, and far better training. I feel like I learn more every week at my current job than I did in my entire time at my last one.
To anyone out there who is young and jobless, or unsure of what they want to do, or sick of stacking shelves at Asda: I implore you, try an apprenticeship. The job satisfaction and the change in your mentality will be more than worth the effort.
You don't need to have programmed from your youth like me; most people start between the ages of 16 and 21 (so far as I can tell), and Codecademy and other services are making it a lot simpler to learn these days. Anyone else remember learning HTML from W3Schools?
I said this to my best friend a few months ago, when he had just had his baby and had no job: Why do nothing, when you can do something you will love?
At the time there was a true disconnect between the real industry and the schooling part of things, mainly because, unlike baker or carpenter, front-end programming was such a new thing that it got thrown into the same basket as print publishing. Thus I had to spend my once-weekly days at school memorizing various paper-fold patterns and colorspace and DPI settings that were highly irrelevant to what I was doing the rest of the week, building websites at a digital media agency.
Like the OP, I was taken under the wing of a guy called Thomas, who was a pretty decent coder and to whom I owe a lot.
Germany wasn't doing so well economically at the time, so shortly after I migrated to the UK. My three years of industry-relevant experience gave me a great base to build my career upon, for which I am thankful to this day.
Note that he moved to the USA, where the stigma of going to the wrong sort of school is less of a problem.
And one of the best reasons to go to college for computer science is making friends who share your interests, those are the people you will network with to find great jobs, start new companies, etc. So while the classroom environment might not be thrilling, there is plenty of education to be had outside the classroom. If you are super enthusiastic, you might even get to work on cutting edge research as an undergrad in a university lab.
Don't get me wrong, I don't think it would have made much of a difference in where I'm at in my career, but I wonder about the rest of me. The people I know who went to college tended to have social experiences that I didn't have (though perhaps working at a startup these days offers some of that). They took classes in subjects that I have only a basic understanding of (physics, chemistry, politics). In some sense of the phrase, they're more "well rounded".
I'm not certain how much that matters. Maybe it doesn't matter at all.
For over a year now I haven't attended any lectures; I've been working at a company instead. I've finished all my exams; the only thing I have left is my thesis. It's hard motivating yourself to do scientific work when you know you can get by just fine without it. And all just for a degree that doesn't matter (to me, at least).
What you're doing would be a very unlikely scenario in systems where studying sets you back a couple of thousand every term, e.g. Australia, the UK, or the US.
He probably left a Gymnasium after tenth grade without getting an Abitur or didn't go to a Gymnasium after finishing Haupt- or Realschule. Unlike dropping out of high school in the US, this would still leave him with a diploma which would make getting an apprenticeship as a Fachinformatiker plausible, which is practically impossible to get without some kind of diploma.
By the way, you can't just drop out of school as a minor in Germany, as not attending school regularly would be illegal.
His learning "disability" is all but universal: teaching style doesn't give students a real problem to solve. That's how science is supposed to be taught. Ask your students: can you build an engine out of a pot of water, a tight-fitting rubber cap that expands, and a small weight that can be attached and detached from a crank at any point in it's motion?
This is how you introduce temperature, heat, pressure, volume, work, etc. And I can't help but think that an apprenticeship basically achieves the same thing: it motivates the learner with real problems that motivate them to find a solution.
This is great in theory, but in practice CS doesn't really answer the problems you have as an application programmer. Much of CS "book learning" is focused on things like compilers, systems programming, or data structures and algorithms. I don't know about you guys, but I spend very little of my time designing languages, writing compilers, or doing systems programming. I don't even spend that much time choosing data structures. No, I spend most of my time learning the interfaces that other people have built around these things: learning how to parameterize and operate within someone else's higher-level environment (also known as a platform).
So what should ordinarily be an exercise in data structures and algorithms becomes a piece of techno-social detective work: nothing less than determining which library/tool/framework is appropriate for the project, and then getting busy learning that combination well enough to release real software. If there exists, anywhere in the world, a class on how to discover, evaluate, and integrate libraries into an existing project, I don't know of it. In the same way, I doubt there exists a class on comparative application architectures.