He is comparing Windows 95 to Ubuntu 11.04 (if you follow the link in that sentence)!
> They [devs] did not bother developing for other platforms because those platforms were economically irrelevant and the Microsoft developer tools worked.
Then he doesn't even make the connection to virus writers targeting Windows from the late 1990s through the 2000s.
> Windows security issues are everywhere and it did not need to be so.
Sorry, but that's mostly due to the desktop market size and Windows' share of it.
Everything after Windows XP had security at its core.
Blame the clueless users who email viruses to all their contacts, download Trojans and warez with backdoors, etc.
And again, he is comparing decades-old MS OSs to the latest versions of Linux and OS X.
> Nowadays nobody under thirty writes anything on Microsoft developer tools unless they are demented or brain-dead.
Completely false statement.
No, it's not. It's mostly due to Microsoft ignoring security for years because it wasn't important to them. They didn't have to have everyone running as root by default in all versions of Windows before Vista (AFAIK in XP Home you can't actually set up restricted users). They didn't have to have lots of open ports offering things like RPC to the world. They didn't have to have all files executable by default, based solely off the hidden part of the filename in AnnaKournikova.jpg.exe.
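The hidden-extension trick mentioned above is easy to demonstrate; a minimal Python illustration (the filename is the one from the worm referenced in the comment):

```python
import os

# With Explorer's "hide extensions for known file types" default on,
# only the final suffix is hidden, so a double-extension executable
# masquerades as a harmless image.
filename = "AnnaKournikova.jpg.exe"
shown, real_ext = os.path.splitext(filename)
print(shown)     # what the user saw in Explorer
print(real_ext)  # what the file actually was
```

Which is why "executable based on a hidden suffix" was such an effective social-engineering vector.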
There are now supposed to be 300M Android devices worldwide, which is within an order of magnitude of Windows' numbers 10 years ago, and you don't see Android phones being compromised remotely within fifteen minutes of being connected to a network. There's no equivalent of Blaster or Sasser or anything close to that level.
It's partly due to Windows' market share that it got targeted so heavily, but those opportunities wouldn't have been there if they hadn't ignored security for so long.
> and you don't see Android phones being compromised remotely within fifteen minutes of being connected to a network.
Again, why do you (and others) keep comparing today's Linux/Android/OS X with a 10-15 year old Windows OS?
Security has been at Windows' core since XP, and by all knowledgeable accounts it is just as good as Linux's... as long as you know how to use it / deal with it. Today 95% of the problem is clueless Windows admins and bad user decisions.
As far as my own experience goes, I've run Windows 3.1, 95, 98, 2000, XP, Vista, and all the rest without ever having been compromised. So it is possible, at least.
What you're doing is the same as when people complain about IE 6 vs. the latest version of Chrome...
IE6 came out in 2001, and at that time it was the most standards-compliant and feature-rich of all the browsers on the market (well, except for IE 5.5 for MacOS).
> They didn't have to have everyone running as root by default in all versions of Windows before Vista (AFAIK in XP Home you can't actually set up restricted users). They didn't have to have lots of open ports offering things like RPC to the world. They didn't have to have all files executable by default, based solely off the hidden part of the filename in AnnaKournikova.jpg.exe.
Of course they had to do all that. Windows users back then were generally not very savvy, and anything that got in their way was a disaster waiting to happen. Also, it was a different time. Even today most Windows home users don't understand the file system, with its drives, devices, directories, subdirectories, and files. And you wanted them to understand user security and how it interacts with the applications they ran? No.
> but those opportunities wouldn't have been there if they hadn't ignored security for so long.
I guess they should have gotten a time machine to the future to pull all that work and knowledge back to the past. Windows XP should have been based off Windows 7.
My point is that what is possible today was not possible 10, 15, or 20 years ago, from both a tech and a user point of view... Just because someone can do OS security well today doesn't mean you can blame someone else for not doing it well decades ago.
Blame the clueless users who email viruses to all their contacts, download Trojans and warez with backdoors, etc.
I also blame MS for having insecure configurations by default. It's possible to secure an NT system quite well, but a lot of compromises were made in the defaults for convenience.
And yet, Unix/Linux had and has a much larger portion of the Internet-connected server market share but without the constant stream of critical vulnerabilities that Windows servers endured in the early 2000s. By your logic, Windows should have been a safer server choice because Unix servers were constantly falling to new attacks.
A couple of things...
1. I really don't know what the Linux server market share was in 2000 in comparison to NT Server. Nor what the breakdown of kernel vs. user-space patches and vulnerabilities was. It also makes things much more complicated when you consider that the two were used in largely different situations and for different purposes.
2. That quote was in relation to the desktop/home-user market.
So I'm not going to go there as I don't want to compare apples to oranges, and on limited knowledge.
> By your logic, Windows should have been a safer server choice because Unix servers were constantly falling to new attacks.
How you're getting that from what I said makes no sense to me.
I'm 29 now, switched to mostly doing C# two years ago, and I love it. Its mix of pragmatism, familiarity, and modern constructs is unparalleled, and no other truly modern language has IDE support this good.
The only thing I don't love about it is the vendor lock-in, but in reality this is a smaller problem for many applications than it seems.
I don't even get the complaint about vendor lock-in. What does using C# lock you into (especially compared to the use of other languages)? Sure, it's possible to use C# in such a way that you're stuck with Microsoft tech, but it's not mandated that this is the case.
The lock-in is mostly such that if you code in C# on Windows first, getting it to run on Mono later may be difficult, if you didn't pay close attention to what you were doing from the start. In this sense, it's rather different than e.g. Java.
But not too different from all that POSIX-specific Ruby and Python code out there, admittedly.
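That kind of POSIX assumption is easy to show concretely; a minimal Python sketch (paths and names are made up for illustration):

```python
import os

# POSIX assumption baked in: a hard-coded Unix path separator.
bad = "etc" + "/" + "myapp" + "/" + "settings.conf"

# Portable: let the standard library choose the separator.
good = os.path.join("etc", "myapp", "settings.conf")

# And feature-detect rather than assume POSIX-only calls exist;
# os.fork(), for example, is simply absent on Windows.
can_fork = hasattr(os, "fork")
```

The same discipline (avoid hard-coded platform details, check for features up front) is what keeps C#-on-Windows code portable to Mono.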
You can use Microsoft developer tools to create platform-agnostic code. So I fail to see how using a great product would make somebody "demented" or "brain-dead". If this attitude is common in other parts of the world, then it is no wonder that it is hard to find people who know MS languages well.
I hate to suggest this, but write that statement down and put it somewhere you will find it when you're, say, 40. I've known a number of Microsoft-dedicated (and Apple-dedicated, for that matter) developers over the years, and I can't think of any right off-hand that are still programming.
I'm not too confident in his ability to assess technology though, and I think he may be a bit premature on this MSFT call.
I do believe that Windows 8 is a mistake made out of desperation. Given that the change in interface will confuse a lot of people, I think enterprise will be wary of making the switch because of all the massive retraining it will require. Even a little change in enterprise environments requires retraining, so I think enterprise adoption will be very slow. Maybe MSFT will wake up and change the interface back, or at least add some sort of switch to toggle back and forth, before it ships a final version though.
*Prediction 2: this will accelerate, rather than slow down, the rate at which enterprises take their enterprise specific software into platform independent programs*
*Prediction 3: by stuffing this up Microsoft has just about lost its bet on moving the retail computer market into docking cloud computers. Apple will do this. And they will do it by stealth.*
Unfortunately, Microsoft seems to have survived the Vista experience without too much injury. And having had some experience with enterprise platform movements, accelerating a "glacial" speed is still pretty slow. And finally, I still have doubts about Apple's capability and interest in the general "retail computer market". (Although I could be wrong about that.)
The business community is now openly doubting the future relevance of the Windows platform!
He has never been particularly bullish on Microsoft. He wrote about his "new" concerns nearly two years ago. [2]
[1] http://brontecapital.blogspot.com/2010/08/microsoft-accounti...
[2] http://brontecapital.blogspot.com/2010/09/microsoft-laid-bar...
Windows ME sucked. Windows XP was great. Windows Vista sucked. Windows 7 was great.
At least that was the perception. Reality is more nuanced, of course.
And business is very conservative, they never upgrade immediately.
For business, the alternative to Windows 8 is Windows 7 or Windows XP. It isn't OS X or Linux or Android. Microsoft does better if they switch to 8, but they don't lose if they don't.
Yes, there were Windows NT and 2000.
And an entire line of server products.
And applications, games, hardware, and a search engine.
Not to mention enterprise services, etc.
The article is all about repeating the standard narrative.
That said, Windows is already a minority computing platform (in terms of new devices sold) so perhaps it cannot be as complacent as it has been in the past. If Windows 8 flops monumentally, when Windows 9 ships in 2014 we may all be docking our pads or phones to a keyboard and/or 2160p wallscreen to use virtualized Windows (when we have to) running legacy software and ten year old microsoft licenses.
1) Windows 8 is changing Windows in pretty fundamental ways (being touch-first, leaving the "PC" behind), and I can't foresee anything but minor changes to this direction in Windows 9. I doubt they will try to go back to making Windows 9 a true successor of Windows 7. That seems very unlikely to me now. Microsoft is all-in with this tile-based version of Windows. Even their logo has changed to reflect that. So businesses that don't want to stay on Windows 7 forever should take alternatives into account.
2) Even if you hated Windows Vista, there was nowhere to turn in 2005 but Windows XP. Now there are some pretty good alternatives, and people are getting used to using different operating systems than Windows, which I think is a huge deal, because it's usually very hard to convince users to use another OS.
So long as they need it they'll pay. And, unfortunately, business needs whatever is backwards compatible with Windows XP.
Hmmm.... I am 29, I write C# and .NET for a living... there are a few lads in the office who are .NET devs, under 30... I must be in a black hole then...
But seriously, I'm afraid he is close to the truth. As a career Windows programmer, I have great fear for the future after having tried the 2012 preview -- I've never felt so lost on a computer in my life. For my own selfish sake, I keep praying they'll make some changes before final release.
I'd love to know the percentage of independent Windows devs that make a living selling software vs. the percentage of Linux devs that can make a living selling software. Sure, big companies will employ both, but how does it work out for the small guys?
Edit: I will admit Linux and its ecosystem are looking more and more attractive all the time...
It's a bit of a learning curve (a very small one), but once you're over it you'll be glad for not having to browse through the Start > All Programs > XXX > yyy mess...
I imagine most of the under-30 devs working on MS software are being hired by established companies to work on enterprise-type software, or by consultancies building websites using ASP.NET, etc.
What about people under 30 who are starting their own businesses from scratch? For example startups, they tend to be using stuff like Ruby/Python/JS etc.
So the question is, what happens when the old guard starts to retire? How many will stick with MS because that is what they have used to that point, or will there be interest in switching to newer tools to build newer systems?
As someone who was under 30 during Microsoft's heyday (90s) I can say that most developers under 30 then also didn't use MS tools. Borland tools ruled the roost until VC6 and VB6 (around '98). And most young developers out of school back then knew Unix, not Windows.
My point is that it's a myth that there existed some time when all young developers used MS tools. In fact I'd argue that in terms of mindshare MS devtools are near their all-time high in popularity now -- it's that the Windows OS isn't as popular as it once was among the under-30 crowd.
All software we write has to pass tests on Windows and Linux (we'll add OS X when we get an OS X computer). I know we're not conventional, but we feel like these are the best tools.
Anything else is just an argument from popularity and isn't worth our time.
"Nowadays nobody under thirty writes anything on Microsoft developer tools unless they are demented or brain-dead. Firstly the kids out of the colleges know the platform agnostic stuff well. Secondly when half the computers leaving factories either run iOS or Android (that is are smart-phones) nobody sensible will write in a way that does not allow easy porting to these platforms."
Kids out of college know what their college specialized in. This is either Java or Visual Studio, and chances are when they graduate they'll get jobs at a company making corporate desktop software on Windows. The best way to discourage an aspiring programmer is to tell him/her "yeah it works, but it's not cool."
I see most people saying he is wrong here, though I can't think why.
Are there people who are: A. writing software for Windows? B. targeting the Windows phone? C. deploying to a cloud (private or public) using Windows servers?
If you are targeting a phone, it is most definitely either iOS or Android, probably both. If MS, a distant third (and at that point you'll probably be using PhoneGap, Titanium, or HTML5).
The number of people actively developing new desktop apps for Windows has to be tiny. Maybe even smaller than tiny.
And if you are deploying to anything other than Ubuntu, you're crazy (and potentially fiscally irresponsible... BizSpark notwithstanding).
I get that some people might be using the dev tools, though I would wager (no numbers on this, just gut) that the number of MS Web Devs is far, far fewer than the same open source web devs (PHP, Ruby, Python, NodeJS, Clojure etc).
So, I don't get why people say he is off base.
Frankly, the only people I can see still using MS stuff are the big corporates. IMO, MS is riding the long tail into obscurity. Though, with their financials, it would still be a long, long tail.
> And if you are deploying to anything other than Ubuntu, you're crazy
What? Have you ever seen a deployment to Windows servers? There's a reason AppHarbor's documentation pages are so much smaller than Heroku's: deploying a .NET web app is peanuts. Through Microsoft-only means, it's done with a single button click from Visual Studio. This works really great. There's a widespread convention for how web applications are structured, supported by all the relevant tooling.
Really, there may be many reasons for not choosing .NET for cloud apps, but server support is not one of them.
Secondly, you're forgetting a major category for .NET developers: devices. Office multifunction printers, cash registers, machines in factories, any mid- to high-tech equipment really. Windows has a massive market share here, and lock-in is only one of the reasons why this is going to remain so. For example, developing a touch-screen interface for an ATM using WPF is very, very easy - definitely among the best options out there. The entire world may be moving to the web, but for devices there is no strong benefit in doing so. Why make a built-in web server and a windowless browser when you can just make a decent native app in half the time, using half the resources? Also, the price of a Windows Embedded license isn't very significant if the device you're selling costs a few thousand dollars apiece.
Admittedly, though, I've no idea how small or large the dev market for machines and equipment is. But it's really pretty sizeable, much bigger than the average HN world-view suggests.
A lot of software is never seen by consumers, don't forget that. A lot of it is never seen by humans at all.
Sorry, but in a realistic production environment, not just a dev/test environment, this is not true. Even Scott Hanselman pointed out that ASP.NET has a terrible "deployment story"[1] compared to other options. I'm currently working as an ASP.NET MVC 3 developer and I've been really disappointed in Microsoft's stack in this regard. Life was a lot less stressful back when I was doing Python and PHP deployments on LAMP stacks.
To do it right you'll probably end up rolling your own PowerShell scripts to do one-shot deployments. These work great - but again - it's essentially the same story you have in the Linux world.
[1]: http://www.quora.com/Chandra-Sekhar-2/Posts/Excellent-unbias...
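For illustration, a minimal sketch of the one-shot-deploy pattern described above - written here in Python rather than PowerShell, with all paths and names made up: stage the build output in a fresh release directory, then atomically swap a `current` symlink.

```python
import os
import tempfile

# Hypothetical staging area (a real script would use the server's app root).
root = tempfile.mkdtemp()
release = os.path.join(root, "releases", "r1")
os.makedirs(release)

# Stand-in for copying the real build output.
with open(os.path.join(release, "app.dll"), "w") as f:
    f.write("build output")

# Point a "current" symlink at the new release; the final rename is
# atomic on POSIX, so the cutover is a single one-shot step.
current = os.path.join(root, "current")
tmp_link = current + ".tmp"
os.symlink(release, tmp_link)
os.replace(tmp_link, current)
```

Rolling back is then just repointing the symlink at the previous release directory.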
Wrong. Windows desktop application development is still huge. Take a look at sites like download.cnet.com and look at how many apps get added each day. And this generally doesn't include the huge ecosystem of things like Office add-ins.
The Windows market isn't a growth market right now, but it's a market where there's still a lot of code being produced.
That's only one part of the cake however. Think of all the companies out there. From engineering to banking to manufacturers of Q-Tips.
Despite this:
-Qualified .NET developers continue to be difficult to find. Also, they make slightly less money than their Java counterparts, for reasons I don't fully understand.
-The 00's are regarded as Microsoft's "lost decade"
-Microsoft's tools gained little traction among startups... although MS-based ones do exist, the most notable being StackExchange; however, that was created by industry veterans, not out-of-college whippersnappers.
Don't count on computer science students to create Microsoft's future. Many will drop the major for something easier.
The gamedev community is very MS-centered. There are a few people doing handheld stuff, or homebrew console hacking, but the majority use Windows and DirectX. A lot of people use C#, or even develop for the Xbox 360.
If I compare Microsoft's handling of the OEM PC industry of the 1990s to Google's handling of the Android ecosystem, the latter is utterly laughable in comparison (*cough* Nexus 7 screen issues *cough*). That alone is reason enough for me to think they could return. They just need the right leadership. And if you think such a transition is impossible, compare the Apple of today to the Apple before Steve Jobs returned...
So is this, in addition to careful attention to typography, the explanation for why Macs were big with designers? (Many designers do the vast majority of their work in a handful of programs, and one in particular.)
Python and Ruby might be first class languages on the web, but they're never going to be on the desktop/mobile. (Yes I know about things like RubyMotion)
No devs under 30 work on Microsoft stuff? Even if you take that to mean "No dev under 30 wants to work on Microsoft stuff", that is ridiculous.
What? How are they accessing these virtual machines? Mind meld? In most cases where companies use VDIs the desktop machines are the standard old Dells and HPs because they actually cost less than "dumb terminals" (aka thin-clients). And that's accepting the questionable notion that VDIs are the future.
Nowadays nobody under thirty writes anything on Microsoft developer tools unless they are demented or brain-dead
We have been on a hiring binge lately and it is very difficult to find candidates who know anything but Microsoft tools. Sure they might know GitHub, but there is a very substantial part of the workforce that still crawls into Microsoft's bosom.
In general this blog post is completely detached from reality. There is the "startup" culture, of course, where everyone runs an iMac and develops iOS and Ruby/MongoDB apps for their EC2 cluster, and then there's the many magnitudes bigger general computing world that holds zero similarities.
Probably with Citrix-compatible thin clients. There are a ton of those out there.
...it is very difficult to find candidates who know anything but Microsoft tools.
That may be true in the US and some parts of Western Europe, such as the UK, but what about the rest of the world? The author of the blog post, John Hempton, is a hedge fund manager who works out of Australia but invests worldwide, and he often thinks and writes in terms of global trends. What is your experience trying to hire candidates outside the US?
Australia may be the exception to the rule, though I doubt that a quarter of Australian CS majors go on to work somewhere using Python, Rails, or PHP. A good number probably go to Java, a compiled language, or iOS dev, but if they're going into web, I'd be hard-pressed to believe that a majority aren't going into an ASP.NET operation.
Claiming that most developers under 30 don't use Microsoft platforms is just pish-posh. They just don't learn it in class and don't do it in their free time. That doesn't mean their first job isn't going to be as a DB analyst on some 10-year-old VB.NET application.
Why would you post this reply when I specifically address that in the next two sentences?
That may be true in the US and some parts of Western Europe, such as the UK, but what about the rest of the world?
India is overwhelmingly Microsoft-centric. Eastern Europe is overwhelmingly Microsoft-centric. Much of Russia is very Microsoft-centric.
But nonetheless your oddly defensive argument (know the author?) sounds suspiciously like the "oh yeah, they're big in Germany" retort. But anyway, what I am replying to is what I see as an immediately ridiculous claim that people under 30 don't know Microsoft, when the overwhelming evidence says otherwise.
The author of the blog post, John Hempton, is a hedge fund manager who works out of Australia but invests worldwide, and he often thinks and writes in terms of global trends
This is one of the stranger appeals to authority that I've yet read. When I think "technology trends expert" I don't think "hedge fund manager". That someone invests money gives them zero authority on technology trends. I should note that my own business is hedge fund technology, so the circle is kind of complete here.
We have been on a hiring binge lately and it is very difficult to find candidates who know anything but Microsoft tools.
Geography. A buddy in Louisiana who was looking for a job tells me there's nothing but Microsoft stack.
Where I live, it depends which side of the lake you live on.
This may depend on a variety of things, such as the job location and desired skill level, but this blanket statement is false. For example, it seems that a significant number of Hacker News participants - who are developers - develop in other languages and platforms.
... completely detached from reality.
My personal opinion is that developers who don't learn other languages, platforms, and tools are completely detached from reality. There will always be (perhaps seemingly a majority of) people where being a developer is just a "job" and learning new things isn't necessary. But, again my personal opinion, this ultimately harms yourself.
Finding good people is hard. Many others have written about this, but don't be discouraged into thinking there are only Microsoft developers out there.
Absolutely. And I don't begrudge those people - I initially learnt .NET myself and then broadened my horizons, but many of my previous co-workers have families, time-consuming hobbies, or other such interests. There's nothing inherently wrong with having "just a job" if you're content with other things in your life. It seems like a specifically (oddly) US-centric view that there should be anything wrong with that.
The blanket statement was a personal observation about our own hiring, so I can assure you with complete conviction that it is not false. Further I addressed the alternate universe of the start-up world, comprising a tiny, tiny percentage of software developers, and that is what HN caters to.
Just to be clear, I don't like that most candidates outside of the startup-sphere are so Microsoft or Java centric. In fact it is a battle that we constantly have to fight (hire somebody and then have to argue every single decision that isn't the Microsoft Way). Yet I have enough real world development experience that I found that claim so ridiculously detached from reality that the author lost any and all credibility on tech matters.
The author says it got better, but in reality Unity has already overtaken Windows' UI, and it will overtake Apple's UI with one of its next iterations.
As a younger (though not that young) developer, I have to agree that using MS developer tools is the wrong way of doing things for most new software - not all of it, though; it sure has its uses. But I am strongly against proprietary software that only runs on one OS, as that will only lose you business. Even the game platform Steam seems to have gotten to that point; a Linux port is on its way.
You would certainly have limited luck trying to sell a C compiler or package manager to Linux users, for sure, but there are certainly areas (including some developer tools) where there could be a market.
If you released (say) an image editor for Windows and Mac, you are competing with Photoshop as well as countless other programs, whereas if you release for Linux you may have fewer potential users (although there are still an estimated 30 million), but you are competing with The Gimp.
There is a subset of Linux users who would never consider any commercial software, but this is a pretty small percentage - at least, that is certainly what Valve is banking on.
Mac users do expect to pay for software, sure, but Windows users? Not so much, since Windows seems to be the platform with the highest piracy rate as well as countless horrible "freeware" programs.
Absolutely. Unity in 12.04 is very solid. There's still a lack of settings options, but I don't think that warrants the whole thing being called a "failure".