Non-technical users want a stable system that doesn't constantly change. Windows loves to hit you with popup notices or UI changes where taking the "OK" route quietly alters your system, or you accidentally click a notification that pops up and suddenly your default browser has changed, or worse. They also don't want to turn on their machine and have to wait 30 minutes for an "important update". They just want to turn their machine on and have it be exactly how it was yesterday.
A non-technical person I know thought he was hacked and wanted to buy a new computer because a shortcut icon was moved from his desktop to his recycle bin and he thought someone wiped out his computer. Now in Microsoft's defense they didn't delete his shortcut, but that's the type of mindset non-technical folks have. The slightest change is a catastrophic event.
...and you're recommending Linux Desktop for that? An operating system famous for breaking compatibility with itself every 2 years or so? Not to mention between distros. I mean, I suppose you can just leave them on some arbitrary un-updated version of Ubuntu forever, but the same could be said for Windows 2000.
Certain programs they run won't work on old operating systems like Windows 7 or 2000, but they work on Windows 10 and Linux. Running an unmaintained old copy of Windows 7 would be pretty bad for someone non-technical, because if they ever decide to go to a questionable site of their choosing, chances are they will get themselves in trouble with a virus or malware.
You can run Xubuntu 20.04 LTS for a few years with unattended upgrades turned on so they get automatic, non-UI-breaking security patches. Then when it goes EOL, upgrade to the next LTS, or even enable automatic release upgrades too, with the stipulation that something might change once every few years instead of twice a week. It's a really good environment IMO.
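For reference, the unattended-upgrades setup described above comes down to two small config files on Ubuntu/Xubuntu (a sketch based on the stock `unattended-upgrades` package; the origin string below assumes 20.04 "focal", and exact defaults may vary by release):

```
# One-time setup (as root):
#   apt install unattended-upgrades
#   dpkg-reconfigure -plow unattended-upgrades

# /etc/apt/apt.conf.d/20auto-upgrades -- turn the periodic jobs on
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";

# /etc/apt/apt.conf.d/50unattended-upgrades -- restrict to security patches
# so UI-changing feature updates are never pulled in automatically
Unattended-Upgrade::Allowed-Origins {
    "${distro_id}:${distro_codename}-security";
};
```

With only the `-security` origin enabled, the machine picks up security fixes in the background and nothing else changes until you deliberately run a release upgrade.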
One could have installed Mint MATE 16 in 2013 and painlessly upgraded between versions, without huge differences between then and now.
Linux isn't a product like Windows; it's an ecosystem. Use whatever works for you.
I say this as a sysadmin responsible for maintaining a small fleet, the majority of which are Linux boxes. I've wasted far more hours helping end users with OS-level problems on Windows and, to a lesser extent, macOS. This is in spite of the fact that, again, the majority of the machines I manage run Linux, and all of these users are similar in technical capability.
Give it a try yourself. I'm sure a lot has changed since you last used it. Oh, and make sure you pick something that holds your hand a bit like Manjaro, Pop!_OS, Mint, or EndeavourOS.
That's what everyone working in IT should understand. No, you can't "always push an update". Treat every single release like it's your last one, because there will be many people who stick with that version.
I'm joking but secretly hoping it happens.
Graphic designers, digital art types, and musicians are all fringe and mostly on Mac anyway. PC gamers are a fringe crowd. Practically all productivity software has migrated to the cloud. Desktop Linux usability seemed to meet the standard of Windows XP and Windows 7 years ago.
Is the problem really that you still have to grapple with graphics drivers on the CLI? Or is it just that Best Buy and Costco still don't have motivation to put shiny Linux machines on the shelves?
and yeah I know all about the various shady Gates/Ballmer-era OEM deals… I just can't believe that not one OEM has snapped by now.
That does not bother me or probably most of us on HN much, but for most people it’s like periodically needing to muck around under the hood of their car. It’s too much, they just want to drive and take the vehicle in to get oil and brake changes every once in a fairly long while. Anything more is tedious overhead.
The situation with proprietary drivers still kinda stinks too. The distros I've seen handle it best are Ubuntu, with its proprietary drivers control panel, and Pop!_OS, which sidesteps the issue altogether for Nvidia hardware by shipping an ISO with the non-free Nvidia drivers included; the rest just leave you with a package manager, a terminal, and whatever you can scrounge up on the internet. Yeah, it's technically the responsibility of Nvidia et al. to fix that, but it's negatively impacting adoption nonetheless.
I've set up Linux on all machines I own or maintain, for all users including my wife. The frequency of hiccups is no higher than it was with Windows, and it's easy to roll back to a known good configuration, much easier than with Windows. I never looked back.
Edited to add: The last machine I bought was a Purism laptop. The first machine which did not have Windows pre-installed.
Bullshit on distros like Solus OS.
Sure, if 1.5 Billion people are a "fringe crowd"
https://www.extremetech.com/gaming/314009-3-billion-people-w...
“The year of the Linux desktop” always feels like just a couple of bad big tech decisions (and an absence of good decisions) away.
As far as Linux on ________, let's look at where we're at now.
Supercomputers. Linux runs the entire Top 500.
https://itsfoss.com/linux-runs-top-supercomputers/
Servers/Websites. 75% are Linux or variants.
https://w3techs.com/technologies/overview/operating_system
Mobile. 72% is Android alone, which counts as Linux IMNSHO.
https://gs.statcounter.com/os-market-share/mobile/worldwide
ChromeOS now exceeds macOS market share. Apple had already been making (painful) attempts to move away from the desktop.
https://arstechnica.com/gadgets/2021/02/the-worlds-second-mo...
And, of course, there's the Windows Subsystem for Linux.
Linux won. It's over. Windows might be holding onto a majority if you only count desktops, but even Microsoft has moved towards supporting Linux directly on their own OS... and in moving towards cloud-as-a-service and hosting, yep, Linux on Azure... Even Microsoft knows that Windows is facing the once unthinkable: support Linux or find yourself increasingly irrelevant. Windows on ARM failed. Twice. Windows Mobile in all of its variants is dead now. Microsoft released an Android device. So given everything we use today, something that ISN'T running a *ix-type kernel is, in fact, the minority.
Microsoft is grasping harder with Windows 11... and in their shortsightedness, they ended up excluding huge chunks of systems that were still being sold as recently as 3 years ago. It's quite unthinkable to me, because the one and ONLY major killer feature of Windows in general was that you could install it on decade-old hardware. Win10 ran pretty decently on my Phenom II X4 desktop with 16GB of RAM. Why shouldn't it?
And now Win11 is looking to exclude first gen Ryzen.
https://www.extremetech.com/computing/324157-windows-11-may-...
The tighter the grip, the more that will slip in between their fingers...
At this point, it's becoming easier and easier to support Linux... thanks to Proton on Steam, more native binaries (BlackMagic, for example, with DaVinci Resolve), more apps becoming webapps and only needing a web browser. At this point, I think what we're waiting for is for legacy companies like Adobe to shit the bed and render themselves irrelevant. (And boy have they gotten close.)
Yeah. The year of Linux on the Desktop is a meme... but should it actually happen... the concept of 'desktop' won't matter anymore, IMHO.
Due to heat, and therefore power, constraints, an iPad-shaped device will remain subpar compared to a PC-shaped device.
I'm not saying anyone has to like it, but there's an inherent advantage in not having to support other hardware of varying degrees of quality. Honestly, I'll take that form of vendor lock-in over frequent pointless changes, having advertising shoved in my face, and being forced to have an online account. Perhaps my opinion will change if Apple follows suit (they won't), but for now it was pretty easy to set up my M1 Macbook Air without signing in with an Apple ID.
Is this a real question? Many of the edge cases, both high cost and low, would disappear for Apple.
Additionally, all the tech support will fall on them.
Lastly, any bad user experience (cheap device with crap battery, low quality hardware that doesn't match the software experience, etc) would have a negative effect on them as a brand.
I've been on both sides of the fence, worked for a Linux company for a few years, and I'm very happy in the walled garden.
The software component in a Mac or an iPhone is larger than in a microwave oven. There is still equally zero incentive to port it to third-party devices, because it's the devices that bring in the revenue; software is a cost center.
* The FTC actually does its job and takes action against Microsoft for antitrust violations.
The web as an open platform is a dev manifesto; for everyone else it's just a glorified entertainment and information hub, and making it safer and easier for them is what the market wants. Not more openness, but less of it, because less openness means less cognitive load. They don't want to think about 100 ways to do the same thing, each with a different license and a complex Venn diagram of incompatibilities. They just want to get on with their day.
"If something goes horribly wrong"... even in that case, the vendors are less likely to fuck up than most users. Google/Apple/Microsoft clouds are much better at keeping data safe against device failures and ransomware than local Windows installs managed by average users ever were or could be.
If anything the future is really dumb computing, where the internet is just another appliance not too different from your radio or the television. Apps with corporate content hubs, not open platforms.
Computing as an open platform was due to the industry by and large being created by engineers. Now with mass adoption, we're seeing a switch to producer vs consumers, with different paradigms/devices/needs for each, like the differences between magazine publishers and readers. Readers don't care what software was used to create a magazine, they just want to pick it off a newsstand and read it. Same with digital entertainment; the underlying stack shouldn't be their concern if their intended usage is simply content consumption. Windows adds only unnecessary complexity to their usage. Stallman is not an average user, and it would be a massive disservice to humanity to design for the average user as though they were Stallman.
Not all freedom is beneficial. Sometimes it's just yet another useless decision to have to make in a world already overflowing with excess information. The human brain did not evolve to make careful cost-benefit analyses for every trivial thing in a post-internet world.
Even devs are moving towards serverless. Content creation might eventually move to "OS"-less, where content creation is moderated by walled hubs like Adobe apps on the iPad and developer experiences happen in virtualized clouds with web-based IDEs. Bare metal appeals to engineers, but for everyday users and developers, again, it's just excess cognitive load. Please don't make people think about useless crap. There are already infinite upcoming crises -- of the global sort -- for anyone born in the last few generations. Computing trivia is just... trivia, no more inherently interesting than the proper type of lubricant to use on the machines in the factory that makes their toaster. Don't make them think without good reason.
Any of these possibilities would be nice:
* The market presents a kinder (and equally-affordable x86) solution and users follow.
* ReactOS advances by leaps and bounds.
* Microsoft realises this trajectory is wrong and rows back.
* Everyone wakes up and realises that Stallman was right.
For the foreseeable future, none of these are likely. What we really need is for something to go horribly wrong in a way that practically demonstrates why remote control and data collection on your own system is a bad idea. Unfortunately, that will be the only way the point is driven home for the passive masses.