I'm sure this is going to get a lot of discussion, so I'd like to preemptively elaborate on the anti-modular part.
Object inheritance is the moral equivalent of copy & paste.
Suppose you have a class, and you wish to reason about its behavior. You will have to do so knowing that the method definitions therein may be ripped out and replaced with different ones. Reasoning about the behavior essentially requires people writing subclasses to look at the code for the superclass, and vice versa. You've saved the clicking of copy & paste, but not the need to share the code.
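A minimal Java sketch of the problem (the counting-set class here is hypothetical, though `HashSet` really does inherit `addAll` from `AbstractCollection`): whether the override is correct depends entirely on an implementation detail of the superclass that you can only learn by reading its source.

```java
import java.util.Arrays;
import java.util.Collection;
import java.util.HashSet;

// A subclass that tries to count insertions. Looks reasonable in
// isolation, but is wrong because of how the superclass is written.
class CountingHashSet<E> extends HashSet<E> {
    int addCount = 0;

    @Override
    public boolean add(E e) {
        addCount++;
        return super.add(e);
    }

    @Override
    public boolean addAll(Collection<? extends E> c) {
        addCount += c.size();
        // HashSet inherits addAll from AbstractCollection, which calls
        // add() once per element -- so each element is counted twice.
        return super.addAll(c);
    }
}

public class Demo {
    public static void main(String[] args) {
        CountingHashSet<String> s = new CountingHashSet<>();
        s.addAll(Arrays.asList("a", "b", "c"));
        System.out.println(s.addCount); // prints 6, not the expected 3
    }
}
```

The fix requires knowing superclass internals (e.g. not counting in addAll at all), which is exactly the "need to share the code" point above.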
There is an exception where inheritance can be well-behaved, and that's where there are a few methods intended to be overridden, with the rest of the class providing meaningful behavior in terms of those few operations. A good example is java.util.AbstractCollection, which provides all sorts of useful operations in terms of the iterator() and size() functions. However, in this case you might as well use an ML module, which makes this relation explicit.
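That well-behaved pattern looks like this in practice (the `Range` class below is a made-up example): implement only `iterator()` and `size()`, and `AbstractCollection` supplies `contains()`, `isEmpty()`, `toString()`, and the rest in terms of those two operations.

```java
import java.util.AbstractCollection;
import java.util.Iterator;

// A fixed-size collection of the integers 0..n-1. Only the two
// "intended to be overridden" methods are supplied.
class Range extends AbstractCollection<Integer> {
    private final int n;
    Range(int n) { this.n = n; }

    @Override
    public Iterator<Integer> iterator() {
        return new Iterator<Integer>() {
            private int next = 0;
            public boolean hasNext() { return next < n; }
            public Integer next() { return next++; }
        };
    }

    @Override
    public int size() { return n; }
}

public class Demo {
    public static void main(String[] args) {
        Range r = new Range(3);
        System.out.println(r.contains(2)); // true -- derived from iterator()
        System.out.println(r);             // [0, 1, 2] -- derived toString()
    }
}
```

An ML functor expresses the same relation, but makes the "give me iterator and size, get everything else" contract explicit in the signature rather than implicit in the documentation.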
The argument in the blog post is pretty much worthless, as they are not backing it up in any way or explaining what they mean by very vague adjectives like "anti-modular". I have been programming functionally in my spare time for the last few years, and there is certainly plenty to love about it, but the lack of a clear set of principles and language tools for decomposing systems into manageable modules is something I would actually view as one of its main problems at the moment.
Decades of writing large-scale OO software produced a large set of such principles, which, applied thoughtfully, produce software that really is modular, in the sense that you can keep track of a large program in your head: the Law of Demeter, the Liskov substitution principle, the open-closed principle, and so on. This imposes a lot more structure upon the code and simply makes it easier to remember things than just using functions grouped into modules/namespaces, which is how it often ends up looking in FP.
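To make one of those principles concrete, here is a minimal (entirely hypothetical) Java sketch of the Law of Demeter: a caller talks only to its immediate collaborator instead of reaching through it into its internals.

```java
// Hypothetical classes for illustration only.
class Engine {
    void start() { System.out.println("vroom"); }
}

class Car {
    private final Engine engine = new Engine();

    // Demeter-friendly: Car exposes the behavior, not its internals.
    // The violating version would force callers to write
    // car.getEngine().start(), coupling them to Car's structure.
    void start() { engine.start(); }
}

public class Demo {
    public static void main(String[] args) {
        new Car().start(); // prints "vroom"
    }
}
```

The payoff for "keeping the program in your head" is that a caller of `Car` never needs to know an `Engine` exists.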
Now, I'm not saying that FP has inherent problems with modularity - I don't know the full spectrum of functional languages and how they approach this problem of grouping large amounts of code into meaningful pieces. But there doesn't seem to be a unified, well-known approach to this in the functional world, so claims of "anti-modularity" should be well backed by arguments and examples of how FP _helps_ modularity. There are surely plenty of people wondering how to write modular applications in functional languages - just look at the popularity of threads like this: http://stackoverflow.com/questions/3077866/large-scale-desig...
Because of the above, I'm really looking forward to the development of languages like Scala, which try to blend the benefits of functional programming, like parallelisation friendliness, with the main benefit of object orientation - clear rules for program decomposition. Also, some of these issues are probably orthogonal to the kind of language used (within certain bounds at least); for example, look at this decades-old paper:
http://www.cs.umd.edu/class/spring2003/cmsc838p/Design/crite...
Recursion helped me learn how to break problems apart. Working with lists and the list primitives all the time gave me a leg up in data structures. Anonymous functions taught me about abstraction. Macros taught me to write programs for clarity and let the machine do the rewrite work. Many of the analytical skills I learned from FP I carry over to programming in other languages and problems solving in general.
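The habit of mirroring a problem's structure in the recursion transfers even to non-FP languages; a minimal Java sketch (the function and data are made up for illustration):

```java
import java.util.Arrays;
import java.util.List;

public class Demo {
    // A list's sum is its head plus the sum of its tail -- the shape
    // of the recursion mirrors the shape of the data.
    static int sum(List<Integer> xs) {
        if (xs.isEmpty()) return 0;                        // base case
        return xs.get(0) + sum(xs.subList(1, xs.size()));  // head + rest
    }

    public static void main(String[] args) {
        System.out.println(sum(Arrays.asList(1, 2, 3, 4))); // prints 10
    }
}
```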
The biggest complaint I heard was "I don't see how this applies in the real world". This is fair: the job listings out there that mention Lisp, Scheme, or Haskell are far outweighed by those for other languages (e.g. Java, C, C#, PHP).
I always responded that learning FP teaches you how to think about programming - one of those "give a man a fish / teach a man to fish" situations. That response generally drew blank stares, but it's something I believe is true.
I remember my Berkeley compiler class, in which we implemented Scheme and APL subsets atop the gcc backend - again a doubly good learning experience, teaching both other approaches to programming and their implications.
I have yet to see OO design taught well beyond its various programming constructs. What is the right level of abstraction for "finding objects", in the words of Bertrand Meyer?
Yeah, but those aren't really the same reasons CMU is switching over. Scheme was historically taught because it offers the opportunity to teach computer science, as opposed to merely teaching programming. Most freshman "CS" courses are oriented around the latter.
The shift into Java/Python/$OOP_LANGUAGE was precipitated by the demands of the real world (e.g. the job market). CMU is leading the pack in that respect -- it just so happens that FP is a better fit for the type of work their students will need to do, given the growth of multicore and distributed systems.
OOP has its place, but it is far from a panacea. I'm glad to see at least one school isn't hammering that into freshmen anymore.
Consistency, concurrency, parallelism, persistence. Yup we're back to basic distributed systems over larger networks.
The vast majority of developers don't need to know all of those things, of course - e.g. those building forms-based enterprise apps or typical RoR applications - but it is kind of interesting to see the resurgence in systems-level skills needed in the industry. At one point, OS and high-transaction developers were down to a negligible percentage of practicing developers, and we thought everyone would move to certification-type education and high-level abstractions.