Of course this is about the server speaking HTTP/2.0. We are talking about a potential client-server protocol upgrade. The fact that millions of applications are written to the 1.1 spec shouldn't be a factor in working on the 2.0 spec.
I believe that HTTP/2.0 will necessarily require applications to be rewritten, or at least frameworks to be updated, in order to take advantage of the new features. And you can expect some pain when other features are taken away: if there is a push to make cookies more 'optional', you can expect that clients will start letting users block cookies entirely. Would you rather your application kind of work for most people but break for those on 2.0 browsers? No, of course not; you'd rather it work for everyone. So why try to run a 1.1 webapp across the 2.0 protocol?
If you are hard-coding $COOKIE statements in your code, you aren't writing it at a sufficient level of abstraction to survive a future major version jump. But there's nothing wrong with that: major version jumps in a protocol are pretty rare, and your code will still work just fine as a 1.1 webapp.
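To illustrate the abstraction point, here's a minimal sketch (in Python, with hypothetical class names; the original's $COOKIE suggests PHP's cookie superglobal, but the idea is language-agnostic): if all session access goes through one small interface instead of reading the cookie header directly, only the backend has to change when a future protocol replaces cookies.

```python
# Hypothetical sketch: route all session access through one abstraction
# instead of reading the raw Cookie header (or $COOKIE) in app code.
# Class and method names here are illustrative, not from any framework.

class CookieSessionBackend:
    """Session state carried in an HTTP/1.1-style Cookie header."""
    def __init__(self, cookie_header):
        self._data = dict(
            pair.split("=", 1)
            for pair in cookie_header.split("; ")
            if "=" in pair
        )

    def get(self, key, default=None):
        return self._data.get(key, default)

class Session:
    """What application code sees; it never touches cookies directly.
    A hypothetical 2.0-era backend could replace CookieSessionBackend
    without touching any application code."""
    def __init__(self, backend):
        self._backend = backend

    def __getitem__(self, key):
        return self._backend.get(key)

# Application code asks the session, not the protocol:
backend = CookieSessionBackend("user=alice; theme=dark")
session = Session(backend)
print(session["user"])  # alice
```

The hard-coded version couples every call site to the cookie mechanism; this version couples exactly one class to it.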
If you're writing applications that expect to be dealing with HTTP requests, then of course you'll have to rewrite them for a major version upgrade of the protocol. That's what should be expected: major version updates needn't be backwards compatible, and that's the main argument of the post. If the update is only marginal, there will be nothing to drive adoption of 2.0 over 1.1.
What I was trying to point out was that, hypothetically, even if HTTP/2.0 isn't backwards compatible, HTTP/1.1 and 2.0 applications could still co-exist on the same site (or even the same server).
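A rough sketch of what co-existence could look like, assuming the server can tell which protocol version a connection negotiated (the `Request` type and handler names below are made up for illustration; this is not any real server's API):

```python
# Hypothetical sketch: one front-end dispatching each request to either
# the legacy 1.1 app or a new 2.0 app, based on the negotiated version.
from dataclasses import dataclass

@dataclass
class Request:
    version: str   # e.g. "HTTP/1.1" or "HTTP/2.0"
    path: str

def handle_v1(req):
    # The existing 1.1 webapp, untouched.
    return f"1.1 app served {req.path}"

def handle_v2(req):
    # A rewritten app using hypothetical 2.0 features.
    return f"2.0 app served {req.path}"

def dispatch(req):
    # Same site, same server: route by protocol version.
    if req.version.startswith("HTTP/2"):
        return handle_v2(req)
    return handle_v1(req)

print(dispatch(Request("HTTP/1.1", "/index")))
print(dispatch(Request("HTTP/2.0", "/index")))
```

The point is only that incompatibility at the protocol layer doesn't force a flag-day switch at the application layer.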
Plus, let's remember, this is all hypothetical - we are still trying to figure out the goals of HTTP/2.0.