Remember, you don't major in Physics to learn which brand of concrete you need to build a bridge. Not teaching a concept because it isn't "used in industry" is a sickening corruption of what academia is supposed to be about.
See my comment here: http://news.ycombinator.com/item?id=1345596
You correctly said that Physics majors do not study the details of bridge building. Indeed, that belongs to a different major (Civil Engineering, I suppose).
In contrast, Computer Science is a very broad discipline. Large portions of it are science; large portions are engineering (or similar to engineering, if we exclude Computer Engineering per se). A good student, or a student with a good mentor, should be able to choose a more theoretical (scientific) or more practical (engineering) path and work on the curriculum accordingly. I understand that it is not always possible (in some places, for example, students cannot choose classes), but at least a student should be able to choose where to focus.
A small point about "picking languages on their own". I think the basics of languages can be taught to freshmen, at least selectively, to help those students who may be lacking in previous exposure to computers in high school. (Depending on the university, this may or may not be an issue.) Teaching the basics of C or Java or, I don't know, Python, whatever, this way does not turn a university into a Java school or anything like that. Indeed it may be as useful as remedial classes in calculus.
It's called "Software Engineering" and some schools (like RIT) already have separate degrees for it.
I think the field would be better off if it recontextualized itself as a form of applied mathematics, with computational complexity at its core. Mathematicians are used to being the weirdos, and have far more experience escaping the whims of industry.
Weaknesses of a CS degree:
1) At the point of graduating college, it is likely that your experience working with teams in a production-like environment will be minimal. I think I did, hmm, three labs like this? None of them produced anything close to a real software product.
2) Academics don't work like industry. I worked at the university for a bit after graduation and we still didn't use, e.g., source control, testing, etc etc. Pick your favorite best practices test, we would have scored negative. It took me a few years to learn better habits.
3) You tend to do a lot of stuff which has little relationship to what you'll be doing for the rest of your life. If I were dictator for life of the CS department, I'd have kids exposed to web programming very freaking early, because it is much more likely they'll end up doing that than fat Java client apps (what we actually did in school).
4) There are a lot of soft skills that go into engineering that some schools give short shrift to. I will defend this one to the death: the most important skill for an engineer is oral/written communication, and their ability to actually produce stuff is a distant second. We had one class on technical writing.
We have our whole lives working in the real world to pick up those things that the University environment can only really introduce us to.
I don't want to waste my CS education on just getting a simulation of what the rest of my life will be like, I want to learn CS especially the things that might be harder to learn via experience in industry.
I'd certainly agree that unis are more competitive at teaching theory (algorithms, math, AI, data-mining, simulations, etc) than practice. When they try to teach industry practice, it's in danger of turning into those nightmare OO courses - all pretension and no substance:
Oooh, John isa Student (a subclass of Person); and Mr Smith isa Lecturer (which is a subclass of Academic, which is a subclass of Person); and our tutor is an example of multiple inheritance!
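For anyone who hasn't sat through one of those courses, the whole taxonomy fits in a dozen lines of Python (names invented to match the joke, this isn't from any actual syllabus), multiple-inheritance gag included:

```python
class Person:
    pass

class Student(Person):
    pass

class Academic(Person):
    pass

class Lecturer(Academic):
    pass

# "our tutor is an example of multiple inheritance!"
class Tutor(Student, Academic):
    pass

john = Student()
mr_smith = Lecturer()
assert isinstance(john, Person)      # John "isa" Person
assert isinstance(mr_smith, Person)  # so is Mr Smith, two levels up
assert Tutor.__mro__[1:3] == (Student, Academic)  # the MRO party trick
```

And that's about all such a course teaches you: a vocabulary quiz dressed up as design.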
The devil is in the details! Stuff you think isn't a big deal as an undergrad CS student might well be a freaking big deal! Example: in my current project, we've found that in the new system version, the vendor's "type and the list selects the matching prefix" functionality kinda works, but slowly. (As in, over 3 keystrokes a second is too fast and the system loses track of the full prefix.) To our users, it doesn't work at all: their accustomed workflow is disrupted across the whole app, so I have to go and patch it.
In academia, stuff like this is considered a triviality. In the real world, it matters a lot!
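A minimal sketch of that failure mode (timings and names invented, not the vendor's actual code): a widget that can only absorb about 3 keystrokes a second silently drops the rest, so the prefix it matches against is wrong for anyone who types fast.

```python
# Hypothetical model of the bug: keystrokes arriving faster than the
# widget can process them are silently dropped, corrupting the prefix.
MIN_INTERVAL = 1 / 3  # seconds; widget absorbs at most ~3 keystrokes/sec

def accumulated_prefix(keystrokes):
    """keystrokes: list of (timestamp_seconds, char).
    Returns the prefix the buggy widget actually matches against."""
    prefix = ""
    last_processed = float("-inf")
    for t, ch in keystrokes:
        if t - last_processed >= MIN_INTERVAL:
            prefix += ch          # keystroke processed
            last_processed = t
        # else: keystroke silently dropped -- the bug
    return prefix

slow = [(i * 0.5, c) for i, c in enumerate("smith")]  # 2 keys/sec
fast = [(i * 0.1, c) for i, c in enumerate("smith")]  # 10 keys/sec
print(accumulated_prefix(slow))  # "smith" -- works fine for slow typists
print(accumulated_prefix(fast))  # "sh" -- most of the prefix is lost
```

To a grader this is "it works"; to a user whose muscle memory types the whole surname, it's "it doesn't work at all".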
Different schools, different methods, I suppose.
I think you are falling into a trap a lot of people fall into. There are a ton of cliche comparisons I could make, but I'll spare you.
If you want web programming, you should just get yourself enrolled in a technical school that will teach you all of that in 2 years.
You're far more likely to:
- Perform unit/functional/user tests
- Use version control
- Be exposed to a variety of both tools (editors/ides) and services (github.com, getexceptional.com)
- Work in a distributed manner
- Get real world feedback on your application from people other than the Professor
- Work with more than the String class of your language
- Collaborate on related, but not dependent, code in the same codebase
- Learn different patterns (Observer, Singleton, etc)
- Develop the way you would in the real world (Scrum? Waterfall? You name it. No real cowboy coding)
when doing web development than doing most "Random Number generator" or "Porpuquine" projects assigned in CS to show recursion or proper OOP.
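For the patterns bullet: Observer, for instance, is about a dozen lines. Here's a sketch in Python (the build-notification scenario is invented for illustration):

```python
class Subject:
    """Minimal Observer pattern: observers register callbacks and are
    notified whenever the subject publishes an event."""
    def __init__(self):
        self._observers = []

    def subscribe(self, callback):
        self._observers.append(callback)

    def notify(self, event):
        for cb in self._observers:
            cb(event)

received = []
build_status = Subject()
build_status.subscribe(lambda e: received.append(f"email: {e}"))
build_status.subscribe(lambda e: received.append(f"irc: {e}"))
build_status.notify("build #42 failed")
# received == ["email: build #42 failed", "irc: build #42 failed"]
```

The point isn't the pattern itself; it's that you meet it naturally (event handlers, webhooks, pub/sub) the first time you build something real.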
I'm probably biased though, as I'm a web developer who went through the Java bullshit and didn't learn a damned thing until I actually tried to make a web application that someone other than my Professor of Data Structures and Algorithms II was testing/grading/using.
ADDENDUM: As for the CS portion of the degree, learning all the aforementioned skills will hopefully allow you to more easily grasp things like the difference between PSPACE and NP-hard problems, genetic algorithms, AI, etc. I'd hope to god that someone designing AI would at least use version control on their application. It would be a nightmare to find a bug that sliced a person in half during surgery without a tool like "git bisect" :)
Abstraction is so much more than "Apple is a Fruit" style OOP. Objects, processes, ideas, language... a focus on abstraction from the start is one of the reasons I think the SICP videos are worth watching.
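A SICP-flavored sketch of what that broader abstraction looks like, in Python rather than Scheme: the *pattern* of summation is captured once, and what gets summed is plugged in as procedures. No class hierarchy in sight.

```python
def summation(term, a, nxt, b):
    """SICP-style higher-order abstraction: the shape of 'sum a series'
    is written once; term and nxt supply the specifics."""
    total = 0
    while a <= b:
        total += term(a)
        a = nxt(a)
    return total

inc = lambda x: x + 1
sum_integers = summation(lambda x: x, 1, inc, 10)    # 55
sum_cubes = summation(lambda x: x ** 3, 1, inc, 10)  # 3025
```

Processes, not taxonomies: that's the kind of abstraction the SICP lectures hammer on from the first session.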
Programmers must learn to develop their own sense of style in their code. Practice writing things until they flow and feel natural. Coding is communication, to you, your team and your computer. Try not to stutter.
Keep in mind I studied in NZ and was generally a terrible academic, we are far away from anything like MIT...
I do think this approach, where there are lots of specialties (Software Engineering, Game Programming, Systems Programming, etc), allows people to get experience and training relevant to where they're going in life.
All that said: I really, really, really want to see realistic mentored debugging go on in a CS program. I want to see two students sit down next to one experienced adult and learn the ins and outs of all the modern debugging techniques (debuggers, binary search, profilers, memory leak detectors, etc). None of my classes had anything like that, and it's the biggest thing missing that comes to mind.
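The binary-search technique on that list is teachable in a dozen lines. Here's a sketch of the idea behind `git bisect` (the history and the bug predicate are invented): given an ordered history where some change introduced a bug, find the first bad version in O(log n) checks instead of testing every one.

```python
def first_bad(versions, is_bad):
    """Binary-search debugging: versions is ordered, everything before
    some point is good and everything from that point on is bad.
    Returns the first bad version."""
    lo, hi = 0, len(versions) - 1  # assumes versions[hi] is bad
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(versions[mid]):
            hi = mid           # bug is at mid or earlier
        else:
            lo = mid + 1       # bug introduced after mid
    return versions[lo]

history = list(range(1, 101))  # 100 hypothetical commits
checks = []
bad = first_bad(history, lambda v: (checks.append(v), v >= 73)[1])
print(bad, len(checks))  # 73, found in 7 checks instead of 100
```

Nobody showed me this in four years of classes; I learned it from a coworker watching me re-test commits one by one.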
The second biggest thing is realistic estimation methods! Until I read a book on the subject and practiced a lot, I was horrible at it. Very little was taught about actual methods for making useful estimates.
(Here is a non-affiliate link to the book whose material I'd like to see covered in a college CS curriculum): http://www.amazon.com/Software-Estimation-Demystifying-Pract...
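One concrete method from that literature is the three-point (PERT) estimate: weight the most likely case 4x between the best and worst cases. A sketch with invented task numbers:

```python
def pert(optimistic, likely, pessimistic):
    """Three-point (PERT) estimate: expected effort plus a rough
    standard deviation, weighting the most likely case 4x."""
    expected = (optimistic + 4 * likely + pessimistic) / 6
    stddev = (pessimistic - optimistic) / 6
    return expected, stddev

# Hypothetical task: best case 2 days, most likely 4, worst case 12
e, s = pert(2, 4, 12)
print(f"expected {e:.1f} days, +/- {s:.1f}")  # expected 5.0 days, +/- 1.7
```

The formula is trivial; the discipline of collecting three numbers instead of blurting one optimistic guess is the part nobody taught me.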
If you read CACM, academia thinks CS is missing students. They want to dumb it down to get more people in (fewer geeks and nerds, more women and minorities) and are looking at ways of making it appeal more to the masses.
I think in general CS is missing industry contribution. I would like to see more commercial enterprises writing papers, submitting at conferences, doing research, and training graduates to become developers.
Some CS programs seem to have buckled under pressure and turned into learn-Java trade schools. These places are missing computer science. Some try to simulate the working environment and focus too much on large team projects because that is what industry wants; but we have our whole lives to do that, and uni can't teach it well anyway. You need real experience for that.
But in general, I think most CS programs, especially after year one are pretty good, and teach what they should be teaching.
I'd like them to teach the classic core CS disciplines: theory, algorithms and data structures, AI, databases, software engineering, graphics. And they should focus more on leading-edge stuff that is perhaps not so well used in industry but provides interesting scope for further research. I think they should deliberately use languages, paradigms, and tools that are not (yet) mainstream in industry.
Fortunately, it appears to be a relatively small drawback. Particular details may change in a few years anyway; being prepared for the change is more important, as is being capable of learning, and this normally comes with a solid foundation which a good university program can provide to a good student.
This is a very general answer. I apologize that I cannot give a more detailed answer; my exposure to the CS education was through a graduate school which gives a different perspective.
1. Contribute to a FOSS project, or start and manage your own. Alternatively, build a product and try to monetize it. This pretty much covers everything that your CS degree claims to teach you. Now we can move on to more important things like ...
2. Learn to meet and talk to new people. Learn how to be comfortable around people, and to make other people comfortable around you.
3. Learn to speak in public. Learn to clearly present your views and opinions to an audience.
4. Meet women. You won't get time to do this once you're a Silicon Valley billionaire ;P
5. Learn how to negotiate. Learn how salespeople and negotiators employ simple psychological concepts to get people to agree with them. Learn how to protect yourself from these people.
6. Study non-CS subjects. Psychology, economics, music, art, whatever. It's critical that you broaden your horizons beyond standard CS topics.
This is what I've figured out so far. If more experienced people have anything to add to the list, please do :)
Does that mean you're a sophomore? If so, you've probably just learned the most basic foundations of CS. I've done 6 years of CS (I was a CS major and I'm just about to finish my master's) and I'm pretty sure I've become a better programmer every year. Maybe I sucked to begin with, in fact I know I did, but nevertheless, I've gained a LOT from my extended education.
I attend IPU, which is more or less a Java school. What they teach us here is geared towards the lowest common denominator. I can confidently say that I could've learned everything IPU has taught me on my own. I've been teaching myself since high school, so I don't find picking up new stuff very difficult. If there's something I don't understand, I ask someone on IRC/Reddit/StackOverflow/HN. The world is full of smart, helpful people you can learn from :)
YMMV, depending on the school you attend.
Technology has a bad habit of arriving first, with the mess being sorted out later. It'd be great to get those creating it to realise the impact they will have on society and to do something about it before it creates the mess. Of course, regular feedback loops on how it's all going throughout the lifetime of the project are also necessary. Just look at the recent Facebook debacle! And the Google wifi scandal! And I could list a whole bunch of other seriously problematic incidents in the last 10 years (which, incidentally, is around when I started my CS degree).
I realise this is mostly a pipe dream, and that most CS students snore through any "ethics" classes they may have, but instead of these "why are we here?" classes, perhaps integrating ethical design principles into the general procedures for software engineering would be a start.
It would also be great to observe or join an existing project -- a project that has been around for some time but still maintains a strong momentum -- and learn from the way it works. That is, short cut some of the process by observing others that have overcome some of the challenges.
This should also teach you that there is so much still to learn about successful software projects that you'll be doing it your whole career :).
So, my advice: Learn to write code in a team.
It doesn't mean you have to give up a broad education in underlying computer science principles to get these, either; just include things like presentations and written reports alongside the theory work as you go. Team work is a contentious one as well: if there is team work, you have to mark each person's contribution rather than the team as a whole. I have had the experience of being lumped with some pretty terrible people, work-ethic- and skill-wise, and having to produce something as a team.
Degrees will always be "inadequate" compared to the fast changing world. What is important is that it gives you tools to apprehend your work and even perhaps your life.
In your case, computer science (mathematics) and fundamental languages are a must (Lisp, C, assembly...).
Get the basics right. You'll have all the time in the world to learn "business stuff".
I think this is awesome compared to, say, another subject I have now, where there are two lectures on the semantic web which seemed to describe the exact same stuff that has been around for the best part of a decade.
Not just diversity of gender (or, say, race). Also diversity of life experiences, ideologies, etc. One of the best things about having studied in an American university, for me, was the number of international students actively participating in school life, together with and just like American students. Such a contrast with the Soviet university where I had studied before that, where all international students were isolated in their own academic groups; we had almost no interaction with them.
Stuff you learn there is valuable. But you won't know how to build things unless you--well, try building things.
Most CS students graduate without experience making products that normal (non-CS) people can use. That's a shame, especially when you consider that some killer products (iPod, Basecamp) have been successful largely because of their user interfaces.
In other words: UI Design is to Computer Science what Industrial Design is to Physics. In that respect, they should bring more programming to the "art" programs (especially digital art), and less art to the computer science programs:
You don't really need to know the nitty-gritty of hash tables, the difference between bubble sort and quicksort, how to implement Dijkstra's algorithm, or how to build a blazing fast trie in order to build a beautiful+easy+informative interface to complicated software.
(You may, however, at some point, need to hand off your completed work to someone who does - depending on the platform)
CS graduates don't know how to ship a product, they don't know how to deal with clients, they don't know jack shit about the real, someone-has-to-pay-the-bills-somehow world.
+1 to lleger from me.
A few people said something to the effect of "the real world is missing". It's true. The majority of CS programs appear to be years out of date with respect to industry practice, whatever "the industry" happens to mean for you personally. To some extent this is inevitable, though we could be doing a lot more to address the issue - say, as someone suggested, via a separate course, one that is updated biannually. A course like this could and should include a discussion of things like source control tools, build tools, etc. Things like resource management idioms in the languages of the day should be discussed explicitly: too often graduates of CS programs assume infinite resources or automagical cleanup, especially if they come from a background in garbage-collected languages. A course in parallel programming presented via different languages and their approaches to parallelism would be fantastically useful; I would pay money to see it on OCW or similar.
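The resource-management point can be made concrete even in a garbage-collected language. A minimal Python sketch (toy `Resource` class invented for illustration): a `with` block releases the resource deterministically, even when the body throws, instead of leaving cleanup to whenever the collector gets around to it.

```python
class Resource:
    """Toy resource that logs whether it was released deterministically."""
    def __init__(self, log):
        self.log = log

    def __enter__(self):
        self.log.append("acquire")
        return self

    def __exit__(self, *exc):
        self.log.append("release")  # runs even if the body raises
        return False                # don't swallow the exception

log = []
try:
    with Resource(log) as r:
        raise RuntimeError("work failed")
except RuntimeError:
    pass
print(log)  # ['acquire', 'release'] -- cleanup was not left to the GC
```

Graduates who have only ever seen "the GC handles it" code tend to leak file handles, sockets, and locks the moment resources stop being memory.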
Somewhat related to Dijkstra's complaint is the following concern: people are terribly bad at thinking about complexity, at all levels - from a single source file to a system composed of hundreds of separate processes. We would do well to discuss managing complexity explicitly in CS curriculum: to make people think about difficulty of maintenance, changes to a working system, rollout procedures for uptime, etc. A seminar on the subject of complexity management, from small to large scale, with examples, would be very useful.
Incidentally, this is something I have been thinking a lot about over the last two years, in part due to conducting a metric shitload of interviews for entry-level positions. There are things that my CS program was missing, and things that a majority of programs out there seem to be missing; understandably, what I can say mostly applies to the former.