One (extended) question though:
The first course is "a cutting-edge class in web application development for mobile devices. Not only does it use texts focused on practical application and cover tech like PhoneGap, Jo, Sencha, jQTouch, and jQuery Mobile, but it is taught by a real-world developer with decades of university teaching experience".
That's not a university course, that's a trade school course. Look at the "textbooks". Probably a useful course, but it's not CS. I know you say "We not only teach CS/SE theory at the highest level, but also provide the practical implementation that prepares you to excel in the workplace." but to be honest that seems like a lie. I don't have much experience with teaching CS but I have some.
The idea that you could teach a practical (necessarily complex) toolkit at the same time or alongside high level CS concepts seems absurd. Students have a hard enough time getting those high level concepts to click but now they are mixing trade school toolkit training in at the same time? Those two goals conflict with each other. It's like using gcc internals for a compiler course.
I am so onboard with the online, just in time, at your own pace learning thing. But I have to say that the copy on this page has seriously dampened my enthusiasm.
You're called "Turing College", yet your only course is a trade school mobile app course covering framework libraries (at least students will have to come back in 6 months for the new version of the course), and you say things like this: "We’re teaching you to be a rock star, not just look like one on paper". WTF? Was brogrammercollege.blogspot.com taken? And blogspot? really?
I hope my impression is wrong, but I'm not coming away with a good one from this page.
The whole concept of being able to mix real world (trade school) courses with heavy theory is a tricky question. We have traditional theory heavy courses in the pipeline, but realistically, it is extremely difficult to bootstrap with theoretical courses. It's been tried before and typically fails quickly. Oddly enough, you can give away theoretical classes, and you can charge $10,000 a class for them, but you can't really sell enough at our $200-$400 price point to pay for the cost of developing the class. We believe you can teach both, and should teach both, but I'm well aware that we'll always have people saying you can't or shouldn't even attempt to do so.
Oh, one note on the textbooks for the course. Of course there is an image issue when you use O'Reilly and Apress texts, but honestly, for that subject there just aren't any traditional texts that come anywhere near the level needed to teach the subject. Even those three have serious holes that Dr. Ostrowski has worked with the authors to plug in this course. If you know of a better text that we somehow overlooked, please drop me a note and we'll see if we can integrate it as we create V2 of the course.
I think a few things struck a nerve with me because I've had a lot of interactions with junior developers who identify with the HN crowd, the startup culture, and the rockstar-programmer thing, and who show really strong anti-intellectual opinions about CS.
Then they end up reinventing the wheel poorly because they would never bother looking up the 30 year old algorithm that solves their problem on Wikipedia, never mind reading an actual published paper. Every time I have to throw away weeks of their work because they wrote their own crappy sorting algorithm or didn't google "bloom filter" I blame their educators and feel bad for them. Because it never occurred to them to think about the high level problem and see if maybe one of those ivory tower geniuses solved their problem already. Sometimes they know they could have but don't know where to start or feel intimidated by that part of the web.
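For anyone wondering what the Bloom filter mentioned above actually is: it's a decades-old probabilistic set that answers "have I seen this before?" in constant space, with false positives possible but false negatives impossible. A minimal sketch (sizes and hash choice are illustrative, not tuned):

```python
import hashlib

class BloomFilter:
    """Toy Bloom filter: a bit array plus k derived hash functions.
    Lookups may return false positives, but never false negatives."""

    def __init__(self, size=1024, num_hashes=3):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = [False] * size

    def _indexes(self, item):
        # Derive k independent-ish indexes by salting one hash function.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for idx in self._indexes(item):
            self.bits[idx] = True

    def might_contain(self, item):
        # True means "probably present"; False means "definitely absent".
        return all(self.bits[idx] for idx in self._indexes(item))
```

Twenty lines, and it replaces weeks of hand-rolled deduplication code; that's exactly the kind of payoff a quick literature search buys you.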
I'm gearing up to do some recruiting for my startup so it's on my mind. "Turing College? sounds like my kind of people. 'Rockstar' programmer? sounds like those kind of people".
Maybe that's unfair and I shouldn't be so crotchety when I'm barely in my 30s. Some clarity on where you guys will fall on the theoretical/practical spectrum and how the theoretical foundation courses will support the practical courses in your curriculum would help people like me get on board.
Maybe some copy on your roadmap for the curriculum and your higher level ideas about how you'll be teaching?
So I doubt that "play[ing] their game better than they do" by pursuing traditional accreditation is really the disruptive strategy here. Blow up the whole rotten credentialist system and replace it with something very different.
(A meta-credentialing service might be a neat startup. With an explosion of non-traditional courses, certifications, and credentials, which actually hold up as meaning something? Communicating something here is a process, trust, and even data/statistics challenge – a nice community/tech opportunity.)
What a system like this really needs is a strong way for future employers of graduates to rate their relative ability (eg. 'according to our benchmarks this person's ability in python lies _x_ far between the average coder on github and [insert famous python user here]').
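The "x far between the average coder and a famous user" idea is essentially linear interpolation on a benchmark scale. A sketch of that scoring, where the function name and the scale are my own invention, not any real benchmark:

```python
def relative_ability(score, baseline, expert):
    """Place a candidate's benchmark score on a 0-1 scale where
    0 is the average coder (baseline) and 1 is a reference expert.
    Values can fall outside [0, 1] for candidates below average
    or above the reference expert."""
    if expert == baseline:
        raise ValueError("baseline and expert scores must differ")
    return (score - baseline) / (expert - baseline)
```

The hard part, of course, isn't this arithmetic but getting benchmark scores that employers actually trust.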
Perhaps establishing 'credentialing' should come before establishing a school?
Yes, the system is rigged against new entrants.
This is an interesting point. I happen to disagree, and I hoped that the author would go on to say exactly why it's wrong, especially dangerously so. It seems like the author rejects curriculum-based courses of study that provide a broad, solid foundation in favor of just-in-time courses of study, in which one learns whatever is necessary at the moment, right before applying it.
My main objections are that
* There are things you don't feel like learning that you would do well to learn. I had a lot of freedom to choose courses during school, which was great, but I am about to graduate with a lot of holes in my knowledge.
* No matter how smart you are, you would benefit from the guidance of a teacher --- guidance through a full curriculum that gives you a solid foundation and imparts to you important patterns of thought.
More importantly, though, is the fact that since you only have 40 or so classroom hours in any class, and you have to teach to the average, it is extremely difficult to build up to a properly high level of skill in any particular subject. It feels hard while you are doing it, but after you graduate, you realize the people who have been focusing on the subject for a couple years are light years ahead of you. It's even more difficult to chain subjects together to reach that high level. The closest thing we have is a generic 100/200/300-level system with some prerequisites.
How does this relate to just in time vs just in case? Even if you assume an identical breadth of knowledge, being able to sequence classes together in series instead of having semester and scheduling gaps means you go into the next class with more knowledge retained from the previous, which means you can build on your foundation in a more logical and efficient way and reach those higher levels that you just can't in a fragmented system. You can approach this from the ground up (building on higher and higher concepts), but the very nature of a JIT system means you can also approach it from the top down. That is, you can define the end result, or top-level class, and then sequence each course to build up the fundamentals you need, just before you need them.
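That top-down sequencing is, mechanically, just a topological sort over a prerequisite graph: name the target course, list what each course requires, and the ordering falls out. A sketch using Python's standard library (the course names are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical prerequisite graph: each course maps to the set
# of courses that must be completed before it.
prereqs = {
    "intro-js": set(),
    "http-basics": set(),
    "intermediate-js": {"intro-js"},
    "dom-and-events": {"intermediate-js"},
    "mobile-web-apps": {"dom-and-events", "http-basics"},  # the target course
}

# static_order() yields courses so every prerequisite precedes
# the course that needs it -- fundamentals arrive just in time.
order = list(TopologicalSorter(prereqs).static_order())
```

The same structure works ground-up or top-down; the only difference is whether you start from the fundamentals or from the target course and work backwards.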
The point? If you are defining a broad base of skills, JIT allows you to master each one quicker and sequence them together to reach higher levels of mastery. If you need skills in the real world, JIT is the quickest and most efficient way to build those skills. The reason I consider disagreeing to be dangerously wrong is that JIT is so much more effective at real education that those who bank on JIC for their future (students, schools, or countries) will find themselves left in the dustbin of history.
1) Retention sucks in the current model of higher education.
I've been thinking about this one a lot lately. I've been doing a one year masters where I'm taking two courses a semester and doing research. The depth with which I am learning things is night and day compared to the depth with which I learned my undergrad material. During undergrad, I was drinking from a firehose and just trying not to drown. I would turn in unfinished problem sets, not having learned the material, and move on with my life. I would sleep through classes out of sheer exhaustion.
Now, with just two classes, I'm able to learn things almost well enough to teach them. So one way to improve retention is just to take things slower.
Another model comes to mind if we consider how people study at Cambridge, Oxford, etc. I have not experienced it myself, but according to students who went there as exchange students (and students from there who came here (here being MIT)) it's pretty different. Students here are overwhelmed with constant work. There, it is a lot more self paced, with a set of final examinations at the end (someone please correct me if I am not doing it justice). So perhaps self-pacing and working smarter, not harder, leads to more retention.
Do you know of any sources for retention statistics such as those you cited? Some of them don't match my experience (for example, I would say that I spent 8-10 hours a week in classes related to my major).
2) Courses need to happen in a logical sequence so that they can build on one another.
When I first read your post I thought you were suggesting that students should, by themselves, pick what to learn based on what they want to build, in lieu of being guided through a logical curriculum.
The point about scheduling gaps is interesting. Scheduling gaps happen because it's hard to satisfy the constraints of so many student and faculty schedules. If you could take courses on demand, that would fix things.
3) Results driven learning can be excellent for motivation and retention.
When one talks about results, there is a fundamental issue of time scale.
Courses that say things like "When you're done, you will have built an autonomous mobile robot" are great.
But there are many fundamental things to learn, over a long period of time, whose benefits
* you might not see for a long time
* are broader than you could have ever imagined (and hence the benefit would seem artificially low to you)
If you as a student get to pick the desired result yourself all the time, you might be tempted to pick shorter-term results. This can be catastrophic to your education.
I believe in forcing people to learn fundamentals of their chosen field --- fundamentals whose power they might not appreciate until later. Learning fundamentals (that you might choose not to learn if you weren't forced to) is fruitful in powerful and unexpected ways.
Take pure math classes. You learn analysis. Then you learn measure theory. Then you learn measure-theoretic probability theory. Then you learn stochastic processes. All of a sudden, financial mathematics becomes easy to grasp. But so do a host of other things. Signal processing, computer vision, statistical mechanics, complex multiagent systems, epidemic modeling, control systems.
I suppose you could have started on this path because you wanted to learn financial mathematics. But it probably would have seemed way too complicated and difficult. But if someone says they want to be an applied mathematician (a much "broader" and more long term goal than just learning financial mathematics), then they'd better take a ton of pure math.
Within the next decade, we will see the rise of the teacher superstar. They will have salaries/compensation comparable to movie stars, except their performance will be teaching online to massive numbers of students.
We can already see a beginning of that trend with Salman Khan or Peter Norvig teaching an AI class online.
Two thoughts come to mind:
1) Salman Khan is arguably this already (although he's not a "teacher" in the narrow, institutional sense) - he's managed to educate thousands of students rapidly, and he's raised a significant amount of capital for Khan Academy (not to mention attracted quite a bit of media attention)
2) The best private schools, I suspect, house many potential "superstar" teachers. The problem is that the schools have most of the reputational value and reap almost all of the financial rewards. In a lot of ways, it's an information asymmetry problem; I attended such a school and had what I would imagine to be a very disproportionately high number of superstar teachers, but I didn't know which teachers were superstars until after I had enrolled in the school and taken their courses.
I can buy the fame for sure; the salary is harder: movie stars may be able to more effectively withhold their appearances/endorsement. Educators have more constraints on their withholding/spinning/endorsing behavior. But maybe!
How about a further prediction:
Once there are teacher superstars and they are considered a normal part of our culture, the pressure to pander or appeal to the masses will result in the most famous being pretty crappy educators. It will be like politics and acting: superstar attributes are more important than teacher/leader/actor attributes.
Unless we get a really good way of measuring student performance somehow. If we can get that good enough then it becomes like sports, the superstars are actually among the best at what they are supposed to be doing. This is the optimistic prediction.
The first one is the realistic one.
I think you're looking at it wrong. You shouldn't be comparing yourself to universities. Don't compare your prices, duration and accreditation with them. You should be building something completely different from the ground up. Something that is viable in today's world, not trying to bandage existing university models to today's world. Like pg said, build your own thing, if it's really good, it will eventually replace universities without you even aiming for that.
Personally, I think you should focus much more heavily on the accreditation side than anything else. Just try to build a certification system together with existing tech employers, something that they would sign, and put a banner on your website saying "company X approves this certificate as important for our selection process". That would get early adopters interested. After you have that, offer your classes for free and make them as widely available as possible. Charge for the certificate and one on one help for those who feel they need it to get your certificate. Well, that's how I think universities will actually get disrupted.
I don't think you can beat the current universities by playing the game by the rules they've set up to manage their competition. Just seems like you've lost before you've even started.
But, you have to look at what the larger impact can be with accreditation and Title IV. Even though we are strongly against student loans, being able to work with state and federal governments for grant and work study funds is the best and most direct way we have of reaching "free education for all" status. It's also the quickest way to move out of the relatively small group of students who don't care about accreditation to reach the much larger group that does. It's only by being a reasonable alternative for this group of students that we can apply real pressure to the current system. As long as we aren't accredited, the existing institutions can simply point to that fact and write us off, and most people will listen to them.
Also, we have thought very seriously about the business model of offering classes for free (Udacity model) and charging for certification/one on one. The numbers we see for that model just don't work, even at a large scale. You have to focus on introductory classes with the broadest possible appeal, reduce the difficulty of the course so you don't get 10,000 complaint emails from students who can't keep up, and use a MOOC structure that is more about cost control than maximizing learning. Plus, the conversion from free to paid user may or may not be enough to cover the cost of course development. Our model, on the other hand, allows us to pay our professors quite generously for classes that may not have 100,000 students attending and focus more on building our catalog instead of just building the user base.
Fees are a little higher but only marginally at around $8000 per 120 credits for international students. (A typical degree requires 300-360 credits)
Many degrees can be completed over 7 or more years if it's convenient. But I think there is a minimum time as some coursework is assessed to a schedule.
So there is definitely a market for this type of learning, and competition almost always benefits the customer, so good luck!
I don't think I agree with you there (or perhaps I just didn't interpret the point correctly). In certain fields of study - and perhaps software engineering is one - this might be true, but it does not hold generally. If tomorrow I find myself needing to write a good zero-finding algorithm in a new language, then yes, I can probably absorb that material quickly. If I find myself needing to model the temperature dependence of something using an esoteric branch of quantum mechanics, then good luck to me without 3 years of prior study in topics that didn't seem relevant to anything at the time.
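The zero-finding case really is the "absorb it quickly" end of the spectrum: a workable bisection routine is a page in any numerics text. A minimal sketch, assuming you only need a bracketed root and not the fancier Brent-style methods:

```python
def bisect_root(f, lo, hi, tol=1e-9, max_iter=200):
    """Find a root of f on [lo, hi] by bisection.
    Requires f(lo) and f(hi) to have opposite signs."""
    f_lo, f_hi = f(lo), f(hi)
    if f_lo * f_hi > 0:
        raise ValueError("f(lo) and f(hi) must have opposite signs")
    for _ in range(max_iter):
        mid = (lo + hi) / 2
        f_mid = f(mid)
        # Stop when the value or the bracket is small enough.
        if abs(f_mid) < tol or (hi - lo) / 2 < tol:
            return mid
        # Keep the half-interval that still brackets the sign change.
        if f_lo * f_mid < 0:
            hi = mid
        else:
            lo, f_lo = mid, f_mid
    return (lo + hi) / 2
```

That's the point: this transfers in an afternoon, while the quantum mechanics example rests on years of prerequisites that no tutorial can compress.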
Cost is actually the one area where we can compete very aggressively. Columbus State may be able to sneak under the $10k mark, but for an equivalent program we'll be just over $3k with a much more flexible and adaptable system. That number, btw, is without the state and federal subsidies the schools get, often upwards of $10,000 per student per year. We're coming in completely unsubsidized at a price that's less than a third of our cheapest competition with an offering that is significantly higher quality (we're competing with MIT for some of our professors).
Brand building really is the toughest nut to crack, and one that we have a couple different strategies for. Not really comfortable discussing those strategies on this forum, but let's just say we've thought a great deal about the problem.
http://news.ycombinator.com/item?id=3742455
Maybe we're all just seeing an end to the vague rhetoric about college "preparing you for life" rather than training you for the workplace.
If you look at the value proposition for a student, our web app class is extremely high. The material covered in that class can have a direct impact on job prospects, billing rates, or internal promotion opportunities, and it is such a difficult area to self teach that saving tens or hundreds of hours fighting through online tutorials is well worth the $200 or $400 that the class costs.
I'm not slagging you, by the way. Just offering my initial reaction as someone who would be interested in such a class, who is doing a CS degree at a traditional university, and who would like to see someone disrupt the higher education model.
I always browse without javascript enabled. Sites that load, and/or warn me correctly get a subtle mental nod and (maybe) even an addition to my safe-list. Obviously a blank white page isn't going to bring anyone back a second time...