This has similarities in type, if not in horror, to the development and subsequent spread of nuclear weapons. When we lost control of those secrets, it was a BFD [0].
But the NSA are - by definition - supposed to be security experts, so what are they doing letting themselves get hacked? They have effectively given away the nuclear football.
I'm shocked we're not seeing more blame in their direction on this one.
Microsoft has done NOTHING to show that things have changed since they colluded with the NSA on PRISM (https://www.theguardian.com/world/2013/jul/11/microsoft-nsa-...), and so anyone who believes that things have changed is a moron.
Remember, head executives at Microsoft are essentially part of the "shadow government" as they were privy to 1984-style surveillance that even much of congress was unaware of until the Snowden leaks. People at MS knew and said nothing. Executives at MS are closer to the NSA than most of congress. Let that sink in.
The problem lies in our defensive infrastructure and our ability to roll out patches responding to incidents.
It also lies in our security infrastructure: that cryptoworms are a danger speaks to a fundamental lapse in permission and process management systems.
The problem is that corporate IT (or management) thinks it can create some sort of stable environment, driven by fear of having things break. Organizationally, they need to accept that they are operating in a dynamic and hostile ecosystem, and that the risk of worms is higher than the risk of some random app breaking on a Windows patch.
Microsoft is responsible for their shit software getting exploited, first and foremost. Seriously fine Microsoft, and by the day after tomorrow that 3,500-security-engineer number will jump to something realistic.
Instead what will happen is more tightening of the walled garden, overcharging of support/security contracts and propping up of another billionaire or two. I can hear the whisky glasses clinking.
Corporations do not get to set the agenda and the narrative. When they are allowed to, the results are very predictable - in this case Microsoft will make more than they lose. Who here disagrees that this is going to happen? And who here believes that it is right?
The answer is simple: whether it's Microsoft today or Facebook and Google tomorrow, win-win should not be an option when such things happen.
There's plenty of blame to go around to be sure, but giving the NSA a pass for developing zero days is batshit insane. These guys are playing god instead of helping make infrastructure more secure overall, and it will not end well, even if they outcompete the Chinese or whatever other bogeyman they cook up to justify their power grab.
You know what? I'm starting to get excited for the walled garden to get more walls.
Native desktop applications get far too many permissions by default - it's crazy that any desktop application, once running, can register itself at startup, see all my files (created by any application), register system-wide keyloggers, take screenshots of other applications, and download my contacts list, all without my permission. We don't let web apps do that, because web app developers aren't trusted by default. We don't let mobile apps do that, because mobile app developers aren't trusted by default. Why on earth do we implicitly trust any executable file run on the desktop so much?
Telling users not to double-click on executables is obviously not working. Even as an experienced user, I have no idea whether some random app on the internet is trustworthy. It's a reverse lottery. I also suspect ransomware like this one would have been slowed down if it needed explicit user permission to read & modify files on disk.
We even know what the sandbox should look like, because we have two working examples in the form of the web and mobile. And we have sandboxing support & APIs in most operating systems. We're just missing the UI part.
I'm imagining something like:
- All apps get signed by the developer (Lean on SSL? Not sure the chain here.)
- The app needs to request capabilities from the user, like on iOS. "App X by Y developer wants permission to read the files in your home directory". (/ Read your contacts / Register at startup / Take screenshots / Modify these files).
- Capabilities can be viewed and revoked at a system-wide level in the control panel / system preferences.
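The model sketched above could look something like this in code - a minimal, hypothetical sketch (all names here are made up for illustration; a real implementation would live in the OS, not userland):

```python
# Hypothetical sketch of the capability model described above: an app must
# request each capability from the user, grants are remembered system-wide,
# and the user can view/revoke them from a central control panel.

class CapabilityStore:
    """System-wide record of grants, viewable and revocable by the user."""

    def __init__(self):
        self._grants = {}  # (app_id, capability) -> bool

    def request(self, app_id, developer, capability, ask_user):
        """Prompt the user once per capability; remember the answer until revoked."""
        key = (app_id, capability)
        if key not in self._grants:
            prompt = f"{app_id} by {developer} wants permission to {capability}"
            self._grants[key] = ask_user(prompt)
        return self._grants[key]

    def revoke(self, app_id, capability):
        # Exposed in the control panel / system preferences.
        self._grants.pop((app_id, capability), None)

    def is_granted(self, app_id, capability):
        # Default-deny: anything never granted is refused.
        return self._grants.get((app_id, capability), False)


store = CapabilityStore()
# Simulate the user clicking "Allow" for one capability and "Deny" for another.
store.request("PhotoFix", "Example Corp",
              "read the files in your home directory", ask_user=lambda p: True)
store.request("PhotoFix", "Example Corp",
              "register a system-wide keylogger", ask_user=lambda p: False)
```

The key property is default-deny: a freshly installed app can do nothing invasive until the user explicitly says yes, which is exactly what the web and mobile models already enforce.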
Do we fine the person who committed the faulty logic, the reviewers, the entire community who "peer reviewed" it?
The point is: the NSA caused this particular problem. Steps should be taken by everyone to ensure something like this never happens again.
This is an absurdly naive viewpoint. How are they responsible? What is their responsibility? How is it their responsibility when a state-funded group/actor targets their software and finds an exploit?
At some point you have to realize that 0days will always exist. It is an impossible task to expect software developers to ship perfect software.
They ostensibly maintain their capability to protect us, but this is a clear example of them failing to protect us. The focus on offensive posture is all macho and typical military industrial bluster. My point is that the offensive cyber capability is more about dick length than keeping the country safer.
Nevermind that the internet is a global shared resource that works best when we work together.
Also, MS haters are doing some pretty fantastic replays of the hits in this thread. I get that you don't like them, but "kill Microsoft" isn't the answer. Maybe there needs to be a model for assigning cost to vulnerabilities like this...to Microsoft and the NSA. Make them account for this in monetary terms and you will see change.
Are you saying that the choice was made by the NSA, who failed to report it, or suggesting that Microsoft colluded in keeping a known exploit open?
The problem is Microsoft, who wrote the exploitable software in the first place.
Most services in Windows run under one of two privileged user accounts (LocalService or NetworkService). Many of them are enabled by default and listen on ports on external interfaces, so the potential attack surface is large.
Microsoft uses programming languages like C++, which are very complicated; a small mistake can lead to vulnerabilities like stack overflows, use-after-free bugs, etc.
Microsoft (and most companies) prefers to patch vulnerabilities with updates rather than take measures that would reduce attack surface.
Oh, and by the way, Linux has similar problems. In a typical Linux distribution, a program run with user privileges is able to encrypt all of the user's files, access the user's cookies and saved passwords for all websites, listen to the microphone, and intercept keystrokes.
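A small, benign illustration of that point (stdlib Python only; it merely counts files, but the same unrestricted access is exactly what user-level ransomware relies on - no prompt, no capability check):

```python
# On a typical Linux desktop, any program running with your user ID can
# quietly enumerate (and therefore read, exfiltrate, or encrypt) everything
# you own under $HOME: browser profiles, SSH keys, documents.
import os

def files_reachable(root):
    """Count regular files the current user can open for reading under `root`."""
    count = 0
    # onerror swallows permission/IO errors instead of aborting the walk.
    for dirpath, _dirnames, filenames in os.walk(root, onerror=lambda e: None):
        for name in filenames:
            if os.access(os.path.join(dirpath, name), os.R_OK):
                count += 1
    return count

home = os.path.expanduser("~")
print(f"{files_reachable(home)} files readable from an unprivileged process")
```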
Yes, but as free software, it inherently has better solutions.
Using a proprietary operating system is like driving a car only the manufacturer is allowed to fix. You don't get to fix the flat tire, and when the manufacturer drops support, you have to buy a new car. If you don't, these situations leave you stranded.
Are you saying all of the major operating systems have poor security because they use "vulnerable" languages?
Absolutely.
You know, not too long ago, Linux used to run NFS on ring 0 too.
There was a good reason for those things; you can find it in the performance comparisons between CPUs and networks at the time.
https://thenextweb.com/microsoft/2015/09/11/microsoft-is-aut...
This one consumed several gigabytes on my C drive without my permission.
https://www.tenforums.com/windows-updates-activation/55185-w...
This one acts like malware.
And this one: http://www.pcworld.com/article/3039827/windows/7-ways-window...
I don't know why I'd choose an operating system that does that.
It pushed some telemetry updates, which arouses some privacy concerns (only after Microsoft's aggressive attitude toward Windows 10 promotion; before that, I was OK with its telemetry updates. I'm aware that telemetry tracking sometimes means well.)
And much more.
All of these behaviors make me think that I'd rather lose my data than suffer from these "features".
Me neither. Stop using Windows.
And how sure are we that they didn't install security updates out of sheer laziness or hubris?
People who run systems that store sensitive information should take computer security more seriously than the people on Hacker News. I would never allow my smartphone, let alone my computers and servers, to run unpatched software. Why is this acceptable for people who have critical systems and data?
Especially if the company that develops the OS in question shows a track record like this one: https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=microsoft+w.... (Security)
I also wonder how long it will take before the shiny new anti-piracy instruments will be abused by a member of the intelligence community, a low-level politician or perhaps embedded into desktop OSes. http://pimg-fpiw.uspto.gov/fdd/50/148/096/0.pdf (You are not the owner of your files)
It's always easy to blame the user rather than whoever exploited the vulnerability in the first place, or whoever does not backport security patches when users obviously do not like the new versions of a piece of software. - https://www.netmarketshare.com/operating-system-market-share... - https://www.extremetech.com/computing/227693-windows-drops-b...
Frankly speaking, Microsoft has gone too far into abuse, lock-ins and presumptions.
As a personal comment, I have an old Windows 7 laptop I use with some win32 software, and I do not have the slightest intention of upgrading to Windows 10 (not out of laziness or hubris, but because IMO the product is not worth the price). And if it were a critical system, then Microsoft Windows would not really be among the options.
* http://www.telegraph.co.uk/technology/microsoft/7898033/Micr...
* https://www.gov.uk/government/uploads/system/uploads/attachm...
* http://www.bbc.co.uk/news/uk-politics-24130684
* https://www.theguardian.com/society/2013/sep/18/nhs-records-...
* https://blog.venngroup.com/august-1st-marked-the-launch-of-m...
* https://www.gov.uk/government/publications/nhs-foundation-tr...
* https://www.theguardian.com/technology/2010/jan/22/internet-...
* http://www.cio.co.uk/it-applications/uks-largest-nhs-trust-d...
Yes, this is a large and complex subject.
Refusing to patch your system because of this is ridiculous (and yes, some blame does lie with MS for pushing people to this).
I routinely disable services (until things stop working and I have to figure out where I went too far), and luckily I'd disabled this one on my Win7 gaming box, even though the updates came through as well (I just manually vet updates, and have a bunch of them blacklisted for adding telemetry).
It's definitely not just end-users either. There's a grocery store that just went up nearby that I saw the Windows XP splash screen on when one of the cashiers rebooted. No joke: a new store, with Windows XP computers that handle money. Microsoft may have cultivated this nightmare, but it seems everyone wants to live in it.
Windows 7 is in extended support until 2020, so as far as I know it's still up to date, security-wise.
> There's a grocery store that just went up nearby that I saw the Windows XP splash screen on when one of the cashiers rebooted.
The cash register may even be running a user interface written in VB6. Don't attach it to an external network and it will work just fine. No need to invest in new hardware/software when you can get it old, working, and cheap.
> Windows XP computers that handle money.
In what way do they handle money? A computer virus isn't going to steal paper money and the device operating the card reader should have been sufficiently separated to begin with.
* Predicate the commercial viability of your software on the basis of technological illiteracy
* Blame the technologically illiterate 'luser user' when things go wrong
* Try and profit from it even as you blame said 'luser user'
The best lesson for Microsoft would be if it incurs a tremendous loss to its reputation, and more importantly its bottom line, because of some issue like this.
It is strange to see people talking about how they made an exception and released a patch for Windows XP this time. Generally, such an exception is the very definition of CYA. If not, why don't they do it for all patches? Read: if the security hole can be used as a way to convince the 'luser user' to pony up more money, don't release a patch. But if the issue is high-profile enough (for example, linking MSFT to a three-letter organization), then better to issue a patch and CYA.
I would rather see it used to build the case against back doors and surveillance culture, but alas, this is merely administrative incompetence and a failure to either upgrade or air-gap systems which have had a clock ticking on them, with plenty of notice from the vendor to sort it out. The buck should stop with the trusts' IT directors, as this was entirely avoidable with a properly managed estate.
There are at least 50 different releases of Windows 10 alone, and it's hard enough to find out which one is actually in use.
The "System" dialog shows "Windows 10 2015 LTSB". "Winver" on the command line shows "Windows 10 2015 LTSB build 10240" - but there are several releases of that, and only the latest ones, e.g. from 10240.17236 and up, have the patch. I can't seem to find which one I have.
I don't doubt I have a patched version, but out of curiosity I'd just like to double check.
https://support.microsoft.com/en-us/help/4013429/windows-10-...
EDIT: Or KB4012606 / KB4013198 for older Windows builds.
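For the version comparison itself, a small sketch - assuming, per the comment above, that 10240.17236 is the first patched 2015 LTSB revision (the threshold is taken from that comment, not verified independently):

```python
# Compare a "build.revision" string, as reported by winver, against the
# first build revision said to carry the patch.

PATCHED_LTSB_2015 = (10240, 17236)  # assumed threshold from the thread above

def parse_build(text):
    """Turn a 'build.revision' string like '10240.17236' into an int tuple."""
    major, _, revision = text.partition(".")
    return (int(major), int(revision or 0))  # missing revision counts as 0

def is_patched(reported, threshold=PATCHED_LTSB_2015):
    """True if the reported build is at or above the patched revision."""
    return parse_build(reported) >= threshold
```

On the machine itself, the full build including the revision number can be read from the `CurrentBuild` and `UBR` values under `HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion`, since winver sometimes omits the revision part.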
Further, all of the major infections are based on Windows XP. Windows XP mainstream support ended a full year before the first-gen iPhone was out! It's seriously ancient, and there are very few excuses for people to have this crap on a network in 2017. For the folks who don't run XP but got infected because they didn't patch? No excuses.
If I booted a Red Hat (5.2 came out in 2009-ish) or FreeBSD machine from 2009 without patches and put it on the internet, I'm pretty sure it'd be hosed just as badly (Shellshock, Heartbleed, ...?). The difference is, everyone would tell me I'm an idiot for putting a machine from 2009 online.
As a tongue in cheek (but totally true) correction, FreeBSD from 2009 would NOT be vulnerable to the shellshock vulnerability unless you explicitly install `bash` and make it the shell used by apache-cgi.
By default, FreeBSD lacks bash.
FWIW, I do hold FreeBSD in high regard. It's just that expecting perfection security-wise from complex systems is a fool's errand.
Servers are much less vulnerable for a number of reasons:
1) The people managing and configuring them are more security-conscious than the vast majority of people. Come on, nobody downloads an email attachment or connects a USB stick they found in the parking lot to a server.
2) It's much cheaper to keep a server updated than a thousand Windows clients.
3) Like whitefish pointed out, even in the worst case scenario you can restore a backup and keep on truckin'.
But yeah, definitely. It's pretty damned unlikely that an OpenBSD backup server would get wormed, unless an ME exploit is involved.
The reason a machine might go unpatched is because it might support some critical hardware (eg medical) for which there is only one or two vendors and only a particular combination of HW and SW are supported (eg due to a specific custom hardware driver).
To lay the blame for this at a single vendor's feet is naive.
I hear tell that, server-wise, NHS IT will also support OpenSUSE, and their record of keeping that patched is almost as good as their record for doing so with Windows.
Policy controls, poor patching and user education are the root cause of the NHS problems.
This whole incident is really raising the profile of the creation of "cyber weapons".
They aren't like physical weapons with physical controls - they are digital; the controls and the costs to copy/distribute are more like those of digital music than anything a government organization is used to.
In that thought experiment, what could be the possible reason for attacking themselves so hard? Well, to give themselves more plausible deniability (the whole attack would be done as an attempt to discredit the NSA)... but also to justify an agenda of technological sovereignty. Russia is in a tug of war with American corporations over where data is stored, and they've even blocked the Microsoft-owned LinkedIn. It's impossible to find an alternative to Windows (considering Russia is such a big PC gaming country), but who knows in 10-15 years.
The real question is why a hospital is still running Windows XP even though it's no longer supported by its vendor.
The answer is vendor lock-in. The upgrade is not a matter of a simple command. Upgrade costs involve more licenses and hardware upgrades (which are not needed, as the old hardware is fine, but this is how things work between Microsoft and hardware vendors). It's like needing to buy a new watch to apply DST summer time.
Also, Microsoft and old-school desktop software vendors used to make sure that switching or upgrade costs were really high, e.g. by using non-standard formats, to lock users out of switching to Mac or Linux.
Remember ActiveX and Internet Explorer-specific VBScript...
If you use free software from an expensive but decent vendor like Red Hat, you can upgrade the software on the same hardware.
And if the software was expensive, you can switch to CentOS or Scientific Linux, or pay anyone to handle that for you at a fair rate. Everything is standard, and there is no vendor lock-in.
I see three areas where this event provides an opportunity for Microsoft and the industry to improve.
Fixed version: I see three areas where this event provides an opportunity for Microsoft, the industry, and government to improve.
To be fair, he does go on to point out how this is partly the fault of poorly conceived government policies, namely the NSA's foolish practice of stockpiling exploits. But Microsoft and the industry should keep the heat on the government about this at every opportunity, because the horrifically bad and analogous idea of having government master keys is still being pushed forward.
Perhaps EOL should be literal. The software kills itself and does not function.
The lesson I'm getting is our software can become malicious, and that malice can spread like wildfire. Is a company obligated to patch any wildfire type of bug forever? Is that a cost of proprietary software? Or is setting a date for its death the cost?
I think aging proprietary software has a much greater chance of becoming a weapon than it does becoming inconveniently obsolete. So forcing a company to release the code as free and open source software upon EOL date, I think just enhances the chances that it gets weaponized. There's a greater incentive to find exploits than to fix them, in old software.
Another lesson is most people really shouldn't be using Windows. If you can't afford to pay Microsoft to keep your software up to date, then use something that's FOSS and is up to date. (Same rule applies to Apple, if you can't afford new hardware in order to run current iOS/macOS versions that are being maintained, then don't buy stuff from Apple anymore.)
Would organisations with very conservative attitudes to upgrade paths or a requirement to run an older OS version have suddenly been patching nightly?
Would the exploits used have been identified and patched prior to their malicious deployment?
Would organisations with a vested interest in stockpiling exploits have elected to immediately notify projects' maintainers?
The answer to these swings wildly between 'maybe' and 'probably not', so the eventual endpoint is likely largely the same. It's a compound issue brought about by a chain of decisions made by disparate organisations, and using it as a stick to beat Microsoft or proprietary vendors in general with is missing a very important point -
Security is the responsibility of everybody involved, from vendors and the government, all the way down through to the people innocently opening infected attachments.
Most probably it's due to the high variety of kernels and versions, and the subtle differences among Linux distributions.
We know they have written such things as part of research. But still they continue to release software that is unfinished.
They have trained their users that failure to update is fatal. No doubt, if they are using Windows.
They also like to conflate "update" with "upgrade". They use these security problems in Windows to scare people into upgrading to Windows 10, whether they like it or not. As others have noted, by design the new versions are not safer than the old ones.
Retroactively fixing reported issues does not make a new version more secure by design. They could just as easily fix the issues in the older version.
Can this company get anything right the first time? Will they ever design a system that is secure?
Do they have any interest in doing so?
Are they incapable?
There is nothing wrong with releasing something simple, secure and finished.
Does MS believe Windows users are not worthy of a secure OS?
I think Microsoft Research has contributed to the development of L4 systems that run on basebands.
Do these systems have the same vulnerabilities as Windows?
Fixing problems after they occur (past problems) is admirable, but other free, open source OSes written by volunteers accomplish the same thing. The question is whether the design of the system is such that future problems are avoided.
Does Microsoft believe Windows users deserve more security? Can Microsoft deliver it?
All indications suggest the answer to both questions is no.
With no viable alternatives, no one can blame Windows users for sticking with it despite red flag after red flag, but it makes no sense to defend the Microsoft approach to security for Windows users. The company has no respect for Windows users.
Being responsive to a constant stream of reported vulnerabilities is an improvement from 1995 but as we can see it is not enough. Their software is still full of mistakes. They need to prove they can make something that is secure by design and that they are willing to do so for users.
(Truthfully, they probably do not need to do anything.
Quotes of 80% of Windows installations being tied to purchases of hardware are probably not far off the mark.
There is no selection of OS by most computer users.
A majority of users still get Windows pre-installed on the computers they purchase.
Microsoft could completely ignore users and it would not hurt their business, as long as they continue to maintain relationships with hardware manufacturers.)
Victim blaming at its finest.
Did the Microsoft President just confirm that the NSA developed the vulnerability that led to the attacks on hospitals this weekend?!
Where did he do that? He said they found it and kept it for themselves, but not that they injected it into Windows.
And about the whole thing, I would rephrase it to "many users learned the hard way why security updates are important".
But it is nice that Microsoft advocates a "Digital Geneva Convention", even though I doubt anything will really change.
This is a bad analogy. The solution to people stealing your Tomahawks is to guard your goddamn bombs. A better analogy would be the U.S. military seeing Al Qaeda has a bunch of Tomahawks and doing nothing because they might be aimed at ISIS.
>A month prior, on March 14, Microsoft had released a security update to patch this vulnerability and protect our customers. While this protected newer Windows systems and computers that had enabled Windows Update to apply this latest update, many computers remained unpatched globally.
They stopped supporting Windows XP years ago, including with security updates.
There are still around 100 million computers around the world running XP.
It seems irresponsible to just hang them out to dry when there are that many machines out there running it. A virus seems inevitable if they do. And shifting the blame onto the customers is not reasonable when there are still 100 million customers who are "doing it wrong" by not upgrading to a later version of Windows.
This entire article pertains to directly shifting the blame onto their customers, and the governments of the affected countries (!)
>The fact that so many computers remained vulnerable two months after the release of a patch illustrates this aspect
Again, XP systems are the most affected, and there was no patch released for XP. This is extremely irresponsible of Microsoft and this article shifting the blame onto everyone but themselves is reprehensible.
Customers like this are why we now have Windows 10, where you're force-fed updates and the OS changes under you, instead of the change being an upgrade to a new major version that you can delay for years. (Which I'm not happy about, but I can see its benefits at that scale.)
The best argument for Microsoft doing wrong here might be that they limit their (expensive) super-extended support to large organizations. Since they do the work, keeping a few boxes with special hardware patched should be an option for smaller shops as well (and is IMHO easier to defend than keeping a large network full of XP desktops running because ?)
The XP support schedule was available from day one. These companies knew exactly what they were getting into. Microsoft even extended the support period for XP on several occasions. It's galling that we as software professionals don't see this as malfeasance by the entities still running XP. They've had close to a decade to upgrade. Software is not a durable asset; it comes with an expiration date on the box.
This isn't just about security patches; there are pieces of XP that are fundamentally insecure, which is partially Microsoft's fault. But on the other hand, the driver model, one of the weakest parts of XP, is the very thing that kept many of these companies from upgrading.
Their product design doesn't emphasize security. For example, remember the extremely convenient AUTORUN.INF feature? That has probably resulted in billions of dollars lost and that number continues to grow every day.
Rendering fonts in the kernel... fantastic idea! What's the next great Microsoft idea? Continue to buy their products and find out.