I'm guessing you didn't write this, but don't write this.
One, your IQ is irrelevant; it's an outdated way to measure anything practical. Two, patience is relative (and vague). Three, it makes you sound like a conceited jerk.
> We need to figure out what is going wrong, not just at the technical level, but at the social and political level
This is not constructive criticism.
All in all this post is one of many examples of non-constructive criticism that plagues the open-source world. We know it's supposed to just work. We know it doesn't. Rants don't add value.
By the way:
> It's supposed to be a holiday weekend. I'm not being paid to run these servers.
Then don't. What are you even doing? You're either working for free or expecting others to, both of which I believe to be a detriment to the free software cause.
This is a common misconception. IQ predicts a lot about a person's life outcomes; see https://en.wikipedia.org/wiki/Intelligence_quotient#Social_c... for a basic overview.
It is usually a faux pas to bring up your IQ online but that is not because IQ lacks predictive power.
Great, you're smart on paper; now what? Maybe you get easily frustrated, maybe you aren't very dedicated, maybe you don't like things that aren't inherently interesting, etc. etc.
Use something objectively measurable. Be scientific. Don't be a cop-out.
The problem with traditional distributions today is that the packaging gives you something, but you are expected (particularly on a server) to take that packaging and then modify your system configuration to suit.
Unfortunately, it's then impossible for the packaging to cover every case on upgrade, since packaging can't possibly know what you did. Suddenly upgrade path code has to magically cover every conceivable use case and more. This route is doomed to fail.
Distributions are working on this problem, but it requires a huge paradigm shift that will leave behind traditionalists kicking and screaming. Take Ubuntu Snappy, for example, with its read-only filesystem and image-based updates. But this sort of thing is the only sensible way forward if you want upgrades to work without failure.
Alternatively, as a user you can take a different course. Make your deployment "immutable", and manage any necessary state independently. In other words, define your deployment as a delta to be applied over the distribution default. If things go wrong, don't try to recover; instead blow it away and redeploy. Manage your delta in version control, and make it easy to test and deploy. This is what the configuration management crowd are doing, as well as the Docker crowd, and just about everyone else.
Of course, this is a little crazy on a desktop system, which is why Ubuntu is doing Snappy.
Or, stick to doing things the traditional way, but don't expect your experience to change.
Many of those users, especially if they're new to Linux, probably just end up abandoning Linux instead of talking about their problems in more detail like the author did.
Even experienced Linux users are moving on. I used Ubuntu and Debian, among other Linux distros, daily for many years. But I too started noticing quality problems. So I've moved to OS X. It isn't perfect, but it generally gives a much better experience than I was getting with Linux, while still letting me use much of the software I was accustomed to using.
I may end up going back to running without an initramfs, at least for VMs that are going to last for more than a day.
The kinds of problems he's reporting are more consistent with hardware failure that manifests only after a reboot, or with misconfiguration of his software, than with defects in the OS boot mechanism.
I have a lot of respect for Linas (I used his Extrusion code to make the glextrusion xscreensaver) but I think in this case he's doing things to his machines that cause these problems, and it's not usually debian or ubuntu's fault.
The reliability and simplicity of 'getting to a shell prompt' out of the box for Ubuntu also seems to me to have been on the decline over the last ten years.
And going much farther back, I would say that inscrutability of boot problems might be at its all time worst.
I have ubuntu systems that will hang for 60 seconds on network failures, or sometimes just refuse to boot. It is very, very frustrating.
And, I'm not a linux newbie -- the first linux kernel I installed was 0.99pl14 on a floppy(!) based slackware system, and I spent years overseeing slackware, then redhat, then ubuntu systems.
I don't think it's a big surprise that CoreOS is so appealing; there's just an awful lot of magic and surprise baked into your standard ubuntu install right now.
Almost everything underneath your GUI (and oftentimes it even is your GUI, cough Gnome...) is written in C.
C is good for micro-systems where you do not want to implement all the abstractions higher level languages require. It is good for implementing low level functionality in other languages (like Python) or for writing a first-try compiler in on a new platform because of its simplicity.
It is not appropriate for an entire OS stack comprising tens of millions of lines of code across thousands of projects and a hundred thousand developers. At least not in the modern age, when we have everything from OCaml (1996) to Rust (2015) showing how to be both safe and fast on bare metal. Even C++ is moving towards a safe yet fast subset of the language where you should never use new / delete anymore.
Going forward, probably the most important revelation the free software world needs to go through is this: for your own personal projects, being a whiz C expert who can hyper-optimize pointer math is great. But as soon as you start accepting merge requests, or worse, delegating maintenance of your codebase across multiple people, C is going to cripple you.
Like I said, it is not C's fault. It is the dogma of Unix that the community holds as sacrament: you write it in C, you use pipes and raw IO buffers, and to question that holistic view is to be seen as opposed to everything about it, even if all you take issue with is the complications imposed by using C everywhere (OpenSSL, systemd, the kernel, udev, NetworkManager; and in personal projects I have had to contend with deep buffer overflows, pointer misalignment, and offset miscalculation in things like PulseAudio, SDL, Mesa, Wine, etc.).
Maybe when the systemd developers pick their next slice of userspace to bring into the collective, they might consider implementing it in a higher-level language. Not because they are bad developers, but because the code they write is not just about them. Think of how many headaches new work could avoid by using something safer like D, or Rust, or even a restricted subset of C++.
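To make the safety argument concrete, here is a minimal Rust sketch (the function and data are made up for illustration) of the kind of out-of-bounds read that is silent undefined behavior in C but a checked, recoverable error in Rust:

```rust
// Illustrative only: an out-of-bounds buffer read.
// In C, `buf[idx]` past the end is undefined behavior; here it is
// an Option you are forced to handle.
fn read_sample(buf: &[u8], idx: usize) -> Option<u8> {
    // .get() returns None instead of reading past the buffer.
    buf.get(idx).copied()
}

fn main() {
    let buf = [10u8, 20, 30];
    // In-bounds read succeeds.
    assert_eq!(read_sample(&buf, 1), Some(20));
    // Out-of-bounds read is caught at the API boundary,
    // not silently turned into memory corruption.
    assert_eq!(read_sample(&buf, 7), None);
    println!("bounds checked");
}
```

Even the panicking indexing form (`buf[7]`) aborts deterministically rather than corrupting memory, which is the property the paragraph above is arguing for.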
Hardly. If anything it stems from a dual attempt by Linux userspace devs to turn Linux into a merger of OS X and Solaris, while applying copious amounts of NIH-ism and second-system thinking.
They are far too willing to throw away whole generations of software and concepts over some esoteric corner case or other, and keep chasing platonic ideals that will never stand up to an encounter with actual usage.
And likely when Torvalds steps down, and thus no longer holds the kernel devs to the "do not break user space" mantra, we will see the kernel fall to the same mentality quite quickly.
It's CADT cubed.
It happens because nobody stopped to think about an interface, or because a configuration file or shell script does not handle an error correctly, or because, since nobody wanted to create a sane error-handling system, a script has no option but to just keep trying and trying.
I do agree that most of the software on one's computer shouldn't be written in C, but those are not the problems this migration would solve.
It has literally been the same for ten years. Ctrl-Alt-T, or Ctrl-Alt-F-keys.
Hilarious, given that his complaints can be blamed squarely on the core (heh) part of CoreOS, systemd.
Ubuntu is a better experience than ever. Open source web browsers are better than ever. Maybe this guy thinks he's more clever than he is, and fucks around with too many things. Maybe he hates change. But, trying to be as objective as possible, this post comes across as whining for the sake of whining.
PS. I haven't seen such an ugly website since the '90s...
I reboot, upgrade and recreate machines on a daily basis. Linux servers and desktops work pretty well these days, despite systemd, Gnome 3 or Unity.
If you need days to boot a server, it might not be the server's fault... just saying...
I am honestly trying not to troll. Tried to read and had to give up.
I wish more people knew about Xfce (Xubuntu). It offers a better Gnome 2-like experience than Gnome 2 ever did.
Although the out-of-the-box theme, look, and feel of Xfce/Xubuntu looks very dated. Changing the theme and desktop background picture and adding some transparency is easy for a geek, and gives you a nice modern-looking desktop. But the default theme, with its gray and blue colors, gives a somewhat Windows XP-like feeling.
This is what my XUbuntu desktop looked like a few years ago: http://imgur.com/BK2leWF
A brand new Debian install on an SSD with UEFI can get me to the desktop in about 4 seconds. Please keep it that way.
Just use Slackware. Or FreeSlack if you are a GNU zealot like me.
> It's spreading, too. Like cancer. Before 2013, web browsers worked flawlessly. Now, both Mozilla Firefox and Google Chrome are almost unusable. Why, oh why, can't I watch YouTube videos on Firefox? Why does Chrome have to crash whenever I visit adware-infested websites? What's wrong with the concept of a web browser that doesn't crash? Why does googling my error messages bring up web forums with six thousand posts of people saying "me too, I have this same problem?" When you have umpteen tens of thousands of users with the exact same symptoms, why do you continue to blame the user?
uBlock.
> I can understand temporary insanity and mass hysteria. It usually passes. I can wait a year or two or three. Or maybe four. Or more. But a trifecta of the Linux boot, the Linux desktop, and the Linux web browser? What software crisis do we live in, that so many things can be going so badly, so consistently, for so long? It's one thing to blame Lennart Poettering for creating buggy, mal-designed, untested software. But why are the Gnome developers creating unusable user interfaces at the same time? And what does any of this have to do with the web browser?
Gnome works really well on sysadmins' workstations. No more cluttered taskbars.
We'll get the answer in a few years :)