http://www.joelonsoftware.com/articles/ThePerilsofJavaSchool...
It's not to teach programming. It's to teach computing theory, algorithm design, etc. That's the reason that the top-level university programs are so heavy on theory. That just happens to fit with systems programmers (who somehow aren't software engineers... curious). It also fits with game programmers, which is where I think a lot of CS students want to go after they graduate.
I also disagree that there is any really good way to teach software engineering at this point in time. Other disciplines such as Mechanical or even Electrical engineering have decades, even centuries of practical knowledge that has resulted in a lot of very strict best practices. Software engineering doesn't have that. Everything in the domain is extraordinarily fluid. Processes are changing rapidly. Tools as well. Even the end product we produce is vastly different from what it was 10 years ago. There simply isn't any good way to teach this subject in this environment short of on-the-job training. It's compounded by the fact that every software development house is different. Nobody has the exact same process, or uses the exact same tools. So teaching SE as a discipline really doesn't make much sense.
i understand quite well what a CS program is, since (as i pointed out in the post) i've been a student in three university CS programs and taught coding at a fourth.
i'm saying that most of what goes on in academic CS departments is not what we need as a society and it's not what most undergrads hope to get into when they enroll. most undergrads get into CS to write software, not to learn "theory" or "compiler design" (although, again, as i pointed out in the post, if this happens to be your bag, more power to you -- but this isn't really about you, it's about the mismatch between what university CS programs do and what people and society really want or need.)
software engineering is obviously teachable (i've been doing it for 20 years) and that's true even if you personally didn't have any good software engineering instructors or mentors. it looks like you and i are in agreement that rapid change in software engineering makes universities not the optimal place to teach software engineering (that was the point of my post). but it doesn't follow that because universities are bad at this, that software engineering is "not teachable".
I do think cube123 read your post, and I do think a valid conclusion from reading your post is that you have missed the point of what a CS program is.
I suspect that this comes in part from your own history, and in part from the way in which CS 'grew up' as a discipline.
The point of your posting is that being employed as a programmer, and learning programming, is not what a Computer Science degree is about. Try getting an undergraduate degree in Electrical Engineering; it's similar in that you get lots of theory on lots of things but little practical experience.
The leap you make which causes me to discard your argument is this one, "but this isn't really about you, it's about the mismatch between what university CS programs do and what people and society really want or need." You have taken on the role of speaking for society and yet you haven't successfully made a case that you can accurately represent what society wants.
You can make an argument that there is a need for a training program between high school and employment that teaches people how to write programs to solve problems. You can call that Computer Engineering, Applied Programming, Programming Technology, whatever. Such programs exist, both in the 'for profit' University world and elsewhere. You can argue that such programs should be structured more along the various processes for producing reliable, testable code, and you do some of that in your education project (always great to put your money where your mouth is like that).
But the opportunity to teach folks who 'just want to code' does not disqualify CS as being a valid course of study, just like 'Accounting' doesn't disqualify 'Mathematics' as a course of study. So your central thesis that 'more universities should shut down their CS programs' fails the 'sniff' test.
Now if you said 'More universities should offer applied programming type degrees' and used your points about how it is what many people want to do, that would be a reasonable conversation to have. Do we want to elevate what have been things like ITT Technical College programs into a more general purpose degree program? Something between 'JavaSchool' and 'CS'? I can see arguments for and against.
But if you are going to blurt out things like "Most undergraduates and professionals actually want to learn applied software engineering, not 'computer science'" you really should try to develop some foundation for that claim. What evidence do you offer that this claim is valid? Some study on college exit exams, some survey of recent CS graduates? A self-selecting poll on Reddit? It's all well and good to wonder if most folks just want to code, but to use it as a claim in your argument that Universities should restructure their CS programs, and to expect your readers to 'buy in' to that, requires that you provide some basis for making that claim.
Your post makes a bunch of claims, four of them in big bold font, for which you provide no supporting evidence or structure at all around why the reader should believe them. Because of that, your message is lost.
I agree with you that it is an interesting topic, and as we've moved 'programming' into a more general skill requirement based on the explosion of 'programmable' devices, you might be more successful making the argument that we need to offer a better high school programming class (much like Typing was offered in the 70's as a way to provide a generally useful skill to High School students).
The flip side is also that an undergraduate enrolls in a CS program, discovers (so it was not their bag before enrollment) that "compiler design" is far more exciting and intellectually challenging than developing typical business/CRUD apps, only to find out after graduation that such jobs barely exist. Yes, let's dispense with academic CS departments.
Maybe the message you tried to convey is that attending mediocre CS courses is much worse than attending good SE courses. That I agree with, but I do not agree that SE should be totally devoid of theory.
I don't think that catering to what undergrads want is a good thing, at all. The majority of undergrads are 17 to 21 year old kids that really don't know what they want to do. They don't know the industries they want to get into. I certainly didn't, and my experiences at UIUC turned me away from going into game programming to a totally different space. And I wasn't a good programmer out of college. I was pretty terrible. But work has taught me a LOT, most of which I wouldn't understand without the theory basis.
>software engineering is obviously teachable (i've been doing it for 20 years) and that's true even if you personally didn't have any good software engineering instructors or mentors. it looks like you and i are in agreement that rapid change in software engineering makes universities not the optimal place to teach software engineering (that was the point of my post). but it doesn't follow that because universities are bad at this, that software engineering is "not teachable".
Like I said in my post, the difficulty of teaching programming and SE in general is that it's changing at an incredibly rapid pace. Languages have evolved. C/C++ were the de facto languages for a while, then Java gained quite a bit of popularity. Recently, C# seems to have taken a large majority of the mindshare.
20 years ago, a lot of SE practices were highly structured, highly documented (and highly time-wasting) systems. IBM's RUP is one example. 10-15 years ago, less structured systems like XP started showing up, and gained quite a bit of popularity. In the last few years, Agile has become very popular. I learned about RUP and XP in my SE courses, but the place I work at doesn't use any of the above.
So what would the program teach? Just the most current, up to date stuff? Or would you try to teach a bit of everything?
The problem with teaching just the brand-new shiny is that you end up with the Java mills from the 90's. They're not teaching programming. They're teaching Java. And it may be worthless in a year. While that may be just what the student wants, it's not what society needs.
But what really helps me is that 4 years of being exposed to all this crazy theory has affected my brain in a certain way: it has altered my perception of problems.
Before going into CS I was programming by chance: slap some code together (without really understanding the problem domain) and poke it with a stick until it somehow works the way you want it to. I think most kids who are just learning to program do that. However, that approach changed radically after I spent some time at the university. In that sense, CS has been really useful, at least in my case. Btw, I'm not saying those skills could not have been obtained elsewhere.
And learning theory really broadens your horizons. Things like computability theory: I have no use for them in my day job, but they're just interesting. Back at the uni, I had a blast writing a Turing machine "emulator" and programming it instead of manually writing the assignment on paper :) And yes, you could study them in your spare time, but the truth is, the job eats up so much of your time, and makes you so exhausted, that you barely have the resources to follow an online lecture after a long day at work. So why not spend some time learning while you're still free from most of the responsibilities of adult life?
These tracks were generally all theoretical (the exceptions being some classes in platforms and devices). Even my friends who had a love for software engineering ended up learning the thinking behind each process, rather than blindly learning whichever process happens to be currently practiced in order to master it.
The way I treated academic learning is very similar to the process described above by 10098. The learning you experience in college should be some form of aggregate information about past ideas, trials, and tribulations. In my experience (as a customer), this is how most universities treat their undergraduates. Rather than tailoring them to become masters of a specific problem, they try (albeit not always successfully, and there are a vast number of people in universities today who don't enjoy, want, or have this mindset) to build the next generation of people to find flaws in our current society, in hopes that they have the motivation to fix them.
That being said, you are making the argument that software engineering is a vocational skill. While I'm not denying that much of the work I've encountered in my limited time in the industry involves skills akin to any other vocational profession, I don't believe emphasizing vocational skills helps the students. It might, in the short term, give them a strong basis for obtaining a software-related position. It might also give them the ability to perform really well in their jobs at that time. It does not, however, enable them to question the process they've learned. Nor does it help them find problems within a process.
It should also be noted that most companies aren't looking for a master software engineer. They test for intelligence and social skills rather than how strict the candidate is about testing practices or which design cycle they prefer. These biases also end up limiting employment, because such candidates will fight rather than adapt to the work environment already established by the company's views on the subject.
A lot of top schools are taking an interest in teaching up-to-date languages. Maybe not Clojure or even JavaScript, but Python is the language I've seen most Georgia Tech students prefer to program in after taking several courses in it.
http://www.cc.gatech.edu/future/undergraduates/threads http://www.cc.gatech.edu/future/undergraduates/bscs/roles/ma...
Of course, you could always take a script-kiddie job at a minor software company. Those wouldn't be fulfilling to anyone anyway, and those are the jobs that exhaust you and make you feel like they are "eating up" your time, instead of being a productive use of it.
Theory for me is more like machine learning (Bayesian theory, probabilistic latent semantic analysis, and far more advanced stuff). With the age of big data upon us, the average programmer is going to need this more and more. Stuff like that happens to be what interests me; I also find it difficult, much to the detriment of consistently high grades.
I think what you're proposing would lower the barrier to entry so much that your average software dev won't be very good, and they will be cheap. Is that what you want for the profession?
During the boom cycles of software development, it's not necessary for someone to go back to university for the skills -- online is good enough, more specialized, and much more cost-effective.
Why get into crushing debt to become an app developer? Or to program a startup's back-end, or to administer a cloud infrastructure?
But so confidently suggesting that "...academic CS departments is not what we need as a society" is no different from the "cargo-cult" thinking of the poster who suggested that many don't take vocational training seriously. You are blogging about and representing a vocational training service; obviously the readers on HN are going to be skeptical.
Choosing the path of academia in education and as a career is a perfectly valid and respectable choice. Suggesting that the work of groups like the ACM and IEEE is not addressing real societal problems is pretty harsh, regardless of how elitist they may or may not be, and regardless of whether those individuals are making a significant contribution to society or not. There's no need to shit on smart and passionate people in specialized / esoteric fields in computer science simply because their work can't be easily app'ified or conjured into an MVP (http://cs.nyu.edu/~jhan/, http://www.dgp.toronto.edu/~ravin/ ...)
I studied computer science in my undergrad. I didn't go to learn how to code nor did I expect my university to teach me how to code, I already knew how to do that. Many of my friends discovered a passion for CS because of their overlapping math/logic/theory breadth courses, not despite it. Assuming the typical CS student is someone who only enrolls to learn things like applied programming grossly oversimplifies and generalizes a person's intentions and motivations for learning, even though I'm sure there are many with that primary motivation.
Online/alternative learning programs, like your company, are a great approach to education and have lived happily in parallel with universities for years. They're not a replacement, though; they're an alternative. I don't think the shortcomings in some universities warrant the "academic butchering" suggested so boldly in the post's title. To suggest that we should start systematically shutting down CS departments simply because their current curriculum doesn't produce good programmers in a way that aligns with your model/curriculum is a scary and dangerous thought. When you take away university departments, you chop the legs off entire disciplines.
He made some interesting points, but this just sounds crazy.
A couple of useful links:
http://blog.jeffreymcmanus.com.nyud.net/%C2%AD1924/%C2%ADmor...
http://webcache.googleusercontent.com/search?q=cache:http://...
Also, a Google cache of the blog post: http://webcache.googleusercontent.com/search?q=cache:http://...
[1] http://codelesson.com/courses/view/introduction-to-web-publi...