Which is not to say that universities shouldn't make these tools available and encourage their use for class projects. But that doesn't require them to "integrate version control into their curriculum," as the article suggests. The documentation for git is freely available on the web, and anyone who wants to can read it on their own time and ask fellow students about it.
The larger philosophical question here is whether higher education should be a vocational program that produces good "team players" who are productive with the current technology on their first day of employment, or whether it should provide an education that teaches fundamental knowledge and how to learn new things independently.
Side note: I've taught a first semester CS course in a well-known university, so I have some experience with the difficulties that even some very smart students face while trying to pick up the basic concepts. Adding in version control would have made it harder for some of them to successfully complete the course.
I think that including these basic tools in an intro course (perhaps as a first or second class topic) would be a good idea. The minor cost in expensive education time would be more than offset by the added productivity in later courses, and a grading workflow built around test suites that one simply pushes code to would probably make students' and graders' lives easier.
An anecdote:
One of my CS courses spent a week or two going over C and its pitfalls. Now, one could argue that any sufficiently bright student should be able to pick up C incidentally while completing a course on programming in *nix, but the fact is that the additional material covering things like dumb use of unions, linker errors, and the like saved several classmates' asses time and time again, both in school and in later years.
You can't have a purely theoretical education, especially in a field where at the end of the day you need (to quote Zed) "programming, motherfucker."
Students will in all likelihood get far more mileage over the rest of their academic and professional careers out of a fast intro to VCS and *nix than out of learning one more esoteric data structure.
By similar logic, mechanical engineering courses should never require students to learn the rudiments of welding or machining.
To turn this around, why should businesses pay former students to learn about this on the job instead of paying them to be productive?
From my experience, there are a large number of developers (both young and old) out there who are very capable programmers but seem not to spend any of their spare time learning things on their own after leaving school. It's insanely frustrating to have to teach new junior developers how to properly use a tool like git when they've never even been exposed to version control, all while they're supposed to be productive. It really can become a waste of company resources.
Are they capable of learning version control on the job? In most cases, absolutely, given enough time and buffer for mistakes. However, I believe that as part of professional development in the field they chose, it should be something they learn outside of work, preferably before getting the job. University seems like a very valid place to introduce and encourage the use of version control systems early on, or at the very least in a higher-level course.
Because it's expected that entry-level employees don't have a lot of experience. It's even possible that these young people have taught themselves all sorts of useful things on their own already, but haven't gotten around to teaching themselves version control or the particular database system or development environment that your company uses. I think it's a better investment to hire someone who is smart and a fast learner than it is to hire someone who happens to know the particular tools you currently use.
In the software field, we've somehow decided that it's normal for entry-level employees to have lots of practical experience. But in most other jobs, that's not the case: they learn at their employers' expense. How many new hires on Wall Street have ever used a bond trader's workstation? How many newly-hired railroad employees have ever driven a train? How many newly-hired lawyers have ever represented a client in court?
Then I get to my internship and have to learn Team Foundation Server... Sheesh.
Basically, I think it's important to teach Git (or another version control system) to students because what you are really teaching them is good development practices. These take time to learn and get used to, and universities absolutely should be teaching them.
The whole issue of "students should be able to teach themselves git" is a bit of a red herring, IMHO.
commit d6abf62d1981a6c16029d10a5ecbe0c5c494322b
Author: Biff Joe <biff@bork.edu>
Date:   Thu Mar 28 12:21:11 2013 +0000

    Change copyrights to my name (woo, dodged a bullet there!).

commit 9c66001752f643e8f2a3453662352cf26402cee3
Author: Biff Joe <biff@bork.edu>
Date:   Thu Mar 28 12:03:01 2013 +0000

    Change variable names.

commit e9b9a740ca52e6fc2843d2d5aed2865c56ace45a
Author: Biff Joe <biff@bork.edu>
Date:   Thu Mar 28 11:39:44 2013 +0000

    Import Nigel's code.

I'd suggest to people doing hiring that candidates who haven't even heard of version control (which some other posters have mentioned encountering in interviews) are ones you don't want to hire. You should be glad to have such an easy selection criterion.