Coupling the updates of single apps to updates of the whole desktop, framework, and libs is just plain wrong. Having to upgrade the whole distro (including all the other installed apps you don't want to upgrade) just to install a new version of the one app you _do_ want to update is a nightmare. Total bullshit. Users. Don't. Want. That. Users don't want one update to trigger another update, let alone an upgrade of the whole desktop.
The blog post by ESR is one prominent example: http://esr.ibiblio.org/?p=3822
He basically wanted to upgrade just one (obscure) app, and the process triggered the automatic removal of Gnome2 and installation of Unity. Just _IMAGINE_ how nightmarish this must look to normal users. You simply don't sneakily remove somebody's installed desktop from under their feet. You simply don't. That feels like a total loss of control over your computer.
I have personally, over the last 10 years, watched people go from Linux (which I talked them into trying) back to Windows for _precisely_ this reason: having to upgrade the whole distribution every few months just to be able to get new app versions. They don't have to put up with this insane bullshit on Windows, so why should they put up with it on Linux?
This "distribution" bullshit is not what is killing desktop Linux, it is what _already_ killed desktop Linux.
The other reasons why desktop Linux never made it (no games, no preinstallations on hardware) are imho just consequences of the distribution concept and the 6-month planned-obsolescence cycle. Nobody wants to bother with something which will be obsolete half a year down the road. Nobody wants to develop for a target that moves _that_ fast.
Windows installations, once installed or preinstalled, run for a decade. Develop something, and it will run on the 10-year-old Windows your grandparents use. Most people encounter a new Windows installation only when they buy a new computer. PC manufacturers know that customers will hate it when their new computer's OS is obsolete within half a year and they won't be able to install new apps, so they don't preinstall Linux. It's as simple as that.
If anybody _ever_ really wants to see Linux succeed on the desktop (before the desktop concept itself is gone), he will have to give up on the distribution concept first.
Even as a Linux nerd I'm constantly faced with problems caused by this. I'm stuck on Ubuntu Natty, for example, because it has the last stable version of Compiz that worked for me with the fglrx ATI drivers. If I wanted the latest version of Unity (I don't; I think Unity is terrible, but this is just an example), I'd have to upgrade Compiz and everything else and get stuck with all the horrible bugs newer Compiz versions have with my drivers. It would also mean upgrading to Gnome 3, which still has many usability regressions (unrelated to Gnome Shell) and in my opinion isn't fully baked yet. I don't want all that shit just to get the latest version of a single package!
(You could perhaps pull it off with some PPA mumbo-jumbo, but you'd have to understand what a PPA is, luck out in finding one in the first place, and messing with them can easily bork something. Not something for the average-Joe audience Ubuntu is targeting.)
Shared libraries made sense in the days of limited space and resources. They still make sense from a few security perspectives. But from a practical perspective, Windows did it right by allowing programs to ship their own libraries and by doing backwards-compatibility right.
I feel like even the original 1984 Macintosh was a better alternative. Each app is contained in one file; it has no dependencies other than the OS. There is no need for installers/uninstallers. For updates, even Sparkle (annoying as it is) at least doesn't disrupt your whole system.
There are Linux distros that try to work this way, but the upstream developers have become lazy; they expect distros to do packaging for them. This leads to some skewed incentives; I think everything works better when the developer is also responsible for packaging, marketing, and support.
I personally LOVE LOVE LOVE that all my apps are updated by the same program, instead of the Windows/Mac way of each app running its own updater.
However, I do agree that the UI for upgrading a single app should be made better.
Just this week, I finally moved out of Unity while staying with Ubuntu 11.10. My workaround is to move to XFCE (Xubuntu). So far, no problems. Btw, I have Compiz enabled with the proprietary nvidia drivers. You might want to try Xubuntu.
It's amazing how easy it was to take out Unity and replace it with xubuntu: sudo apt-get install xubuntu-desktop.
The xubuntu people have taken the trouble to interface with all the Ubuntu plumbing (networking, sound, updates etc) via XFCE. Overall I'm impressed with the Ubuntu ecosystem.
I very much agree with the idea of a slow-moving core, though. That's how I've tended to use Debian: a stable foundation upon which I can install less stable higher-level software (the things I actually interact with). Of course, this doesn't address the distro fragmentation problem Ingo is talking about.
Optionally bundle specific versions of libs (or compile them in statically), place them in the user's home directory, and set the path to look there first (maybe that's what you're saying exactly?).
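A minimal sketch of that idea, assuming a hypothetical app named "foo" installed under ~/apps: the launcher prepends the bundled lib directory so the dynamic linker searches it before the system paths.

```shell
# Hypothetical per-user app layout; no system package manager involved.
mkdir -p "$HOME/apps/foo/lib" "$HOME/apps/foo/bin"

# A launcher that makes the dynamic linker prefer the bundled libs:
cat > "$HOME/apps/foo/run.sh" <<'EOF'
#!/bin/sh
DIR="$(dirname "$0")"
# Bundled libs are searched before /usr/lib thanks to LD_LIBRARY_PATH
export LD_LIBRARY_PATH="$DIR/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
exec "$DIR/bin/foo" "$@"
EOF
chmod +x "$HOME/apps/foo/run.sh"
```

Updating the app then means replacing one directory, with no effect on the rest of the system.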
Stop using the same tool for updating userland apps and core system stuff. Using the same app to update "/bin/ls" and "audacity" is, imo, at the core of the brokenness. These are different types of software with different areas of responsibility, but we lump them all together into one tool and process.
Having multiple distributions is fine; someone just needs to make one with a package manager that gets this stuff right. Then your parents can just use that one distribution and never need to care about what other distros do.
In fact, Ubuntu wants to be that one distribution that is easy for normals. But as you illustrated, it still falls short...
Distributions freeze the set of available app versions to a specific version of the base system. You can't get a new app without upgrading everything else too, including all the other apps. I can't simply get a new Emacs on my Ubuntu, because to do that, Ubuntu also wants to remove my Gnome2 and replace it with Unity.
Windows decouples the base from the apps. When you want to update an app, you just do it; everything else remains untouched. If I want a new Emacs on XP, it won't force me to simultaneously upgrade to 7.
Windows would have the same problems if, every few months, they bundled a new set of base libs and 20,000 apps that only work with that specific set of libs. But they don't. They invest great effort into keeping the base system slow-moving and stable and supporting it for a decade or more, and it shows.
And I don't see what's wrong with requiring users to have an up-to-date system. Security fixes alone make regular updates almost mandatory on all operating systems. Windows installs run for a decade, but they still get constant security upgrades, including ones that require restarts and big scary service packs.
System upgrades simply should be totally painless. The kernel gets updated constantly without users noticing, it should be the same for all upgrades. Maybe a rolling release would be a better solution, because it gets rid of the scary system upgrade user interaction, but they're more difficult to QA.
Maybe the repository administration model needs to be changed. Giving the devs more control/responsibility for their package in the repository might be a good idea. Many developers already set up PPAs to get there.
The problem is not forced invisible security updates; the problem is forced user-visible upgrades.
When you want to upgrade one app, you may not want to simultaneously upgrade another app, let alone the whole desktop. With the distribution model, there's no way to avoid this forced interdependence.
I want a working-for-me system. Breaking it to make it "up to date" is a bug.
You're assuming that an application or version that has been replaced by something newer is necessarily broken/inferior/etc. You're wrong.
Yes, some updates do address security, but many/most don't and even the ones that do don't necessarily apply in all situations. Not to mention that security isn't the only priority.
Security isn't the only priority offline, so why would anyone think that it would be online? Disagree as to online? Show me your car, residence, or person and I'll be happy to demonstrate.
One advantage of the distro model over the Windows model: you've never had a toolbar installed in your browser or your homepage changed because of a package you installed from a repo, have you? I find the Windows software ecosystem (at least the freeware portion of it) far more annoying than anything package managers do. Talk about unwanted changes to the user's computer.
The reason that everything is packaged together like that is so library updates and other things can be tested and vetted for compatibility. Part of the challenge of allowing such a variety of OS and system configurations is that library or software package maintainers are unlikely to have tested their updated code on every possible distribution out there, and so that makes it the responsibility of the distro maintainers.
Believe it or not in practice this doesn't create many issues. It might be an obstacle for casual users who are trying to transition into power users, and are trying to tweak their system. But most other users aren't encountering the same frustration.
How do you manage the 1000 packages and their libraries and dependencies? Many of them have separate runtimes which may or may not depend on the other packages runtime. How would you design an application sandbox to cover them all?
What you're saying, basically, is that Linux has failed. At least for me, since I can't see another way to distribute and manage the massive amount of packages that sit on gnu.org.
I agree with you, and it is my opinion that whoever solves these problems is in for a lot of business opportunities.
A small core set of libraries that changes _very_ slowly and isn't intentionally obsoleted every few months. Think of Windows: slow-moving, stable, and supported for a decade. Distribution of apps decoupled from distribution of the base. Never make an app update trigger a lib update.
> How do you manage the 1000 packages and their libraries and dependencies?
You don't do that at all. Developers do it themselves, like they do on Windows and OSX. Every dev packages his own app and either puts it into the app store or distributes it himself. You manage only the libs, and you don't allow them to change fast or in an uncoordinated, chaotic way.
> What you're saying, basically, is that Linux has failed.
From the point of view of a normal user, yes. For a normal user, it is not an option. Everybody I personally know who tried it went back. The main reason for most of them was the insanity of application management. (And the lack of hardware drivers and games, but that's not Linux's fault.)
> since I can't see another way to distribute and manage the massive amount of packages that sit on gnu.org.
Decouple libs and apps. Don't change APIs and lib versions every few months. Make the base a very reliable, slow-moving target. Don't force anybody to change everything every few damn months.
> It is my opinion that whoever solves these problems is in for a lot of business opportunities.
The problem is already solved, at least on Windows and OSX. That's why Windows and OSX get all the desktop business and Linux gets none.
_Programs under PC-BSD are completely self-contained and self-installing, in a graphical format. A PBI file also ships with all the files and libraries necessary for the installed program to function, eliminating much of the hardship of dealing with broken dependencies and system incompatibilities._
[1]:http://www.pcbsd.org/index.php?option=com_zoo&view=item&...
Hmm, that's not exactly what happened, according to the link: "I upgraded to Ubuntu 11.04 a week or so back in order to get a more recent version of SCons."
Your overall point is well taken, but I wonder how much it affects what I think of as "normal users", who don't care so much about upgrading to the bleeding edge of scons. Consider a hypothetical user of Hardy, so they've had it for four years: what are they actually missing if what they do is web surfing, email, and maybe document editing?
Ubuntu's answer to that? Well, those aren't security fixes, so you can upgrade to $LATEST_RELEASE if you want the non-critical fixes. Ubuntu is trying to force a 2+ year bugfix cycle on software maintainers, and that's just not realistic for many small teams (both proprietary and open source).
This is a particular example, but I can think of other cases where this might be a problem. OpenOffice updates after a new MS Office release come to mind offhand.
You don't have to do all that to upgrade a single app. In fact you're thinking of it backwards. Distros mean that when you upgrade the OS or libs, you get new versions of the apps for free.
You can still configure && make && make install, or grab a statically-linked binary, or any other method of getting Linux apps to run.
You aren't serious, are you?
> statically-linked binary
Nobody builds them.
> any other method of getting Linux apps to run.
There are no other methods.
> Distros mean that when you upgrade the OS or libs, you get new versions of the apps for free.
That in turn means that if I don't want to upgrade the OS and the libs, I can't get new app versions. The collective refusal to acknowledge that this is a problem is what is holding back (aka killing) desktop Linux.
Evolution will never truly replicate the results possible with design.
I left my cousin with ArchLinux; he liked it a lot, so much that he was still using it 6 months later. At that point he wanted to install a new program, and from then on he stopped using Arch, because he had to call me to help him fix his system. Issuing pacman -S programname failed, so he did pacman -Sy followed by pacman -S programname again, and this time his entire system was about to be updated. He answered yes to all the questions... and suddenly his entire desktop was different from what it was before. All the programs got updated! He didn't want that!
These are people who think a computer is broken if the taskbar on the desktop is accidentally moved to another position by their children. Of course they thought they had broken their computer, and in a sense they had, since their settings and gadgets on the Plasma desktop stopped functioning, and some just changed. Without asking the user, just changed! That is a pure and simple WTF. That's when I realized Linux will never, ever succeed on the desktop if it doesn't change fundamentally.
No. He wanted a new version of the app and upgraded to a new version of the OS (with a new set of default packages). And he got surprised by getting a new GUI, something which is rather odd because Unity was one of the most publicized features of Ubuntu.
> the process triggered the automatic removal of Gnome2 and installation of Unity
Not really. Gnome2 would still be there. Just the default UI is Unity. I'm more than a little bit surprised ESR had trouble remembering you switch UIs on login. I've been doing it since my Solaris (2.5) days. I loved OpenWindows.
> having to upgrade the whole distribution every few months just to be able to get new app versions
That's not really true: you have to do so because the distro publisher won't support the newest Chrome on their 2006 OS. It's ridiculous to demand that they spend their resources on your particular needs. If you are not happy, you can ask for your money back. And even when the distro publisher doesn't want to add newer versions to an old OS, you can always add private repos maintained by the makers of your favorite software.
And, remember, keeping stable versions of software (even when a newer, flashier version has been made public) is exactly what some people want. I want my servers stable.
> This "distribution" bullshit is not what is killing desktop Linux
It was never much alive. Linux is an OS that suits a couple users well, but not most of them.
> are imho just consequences of the distribution concept and the 6-month planned-obsolescence cycle.
It usually took much longer to get a new version of your favorite Linux distro; 6 months is just the current standard. And, again, there is no planned obsolescence. There are many alternative places to get newer versions from.
> Windows installations, once installed or preinstalled, run for a decade.
I don't believe we've met, sir. What planet are you from?
> If anybody _ever_ really wants to see Linux succeed on the desktop (...) he will have to give up on the distribution concept first.
I don't think so. In fact, most people don't think so. And, let me say that not thinking so works quite well.
You do realize the incredibly arrogant position you are taking. You purport to be the savior of the Linux desktop (do we need one, BTW?) and to have realized what's wrong with it and, best of all, you have the solution! Just do everything opposite to how it's been working for decades and all our problems will be solved.
Let me put it simply: when you think you are the dumbest person in a room, you are probably right. When you think you are the smartest person in a room, you are most probably wrong. And if you disagree with everybody else in the room, odds are you are really the dumbest person there.
Maintaining a distro is a lot of work, but until we can get software makers to agree on a single package format, a single way to manage configurations, and a single way to organize the file hierarchy, the distro way will remain a very popular way to manage your computers.
Cross-distro app repositories are also a possibility, thanks to the Open Build Service (http://openbuildservice.org). And since MeeGo's community apps service is open source (https://github.com/nemein/com_meego_packages), all software needed for this (including an app store client app) already exists.
What is needed is a major distribution to make the first move on this.
Anyway, care to give links to more on MeeGo's architecture?
So if your package conflicts with something, either fix that issue or don't release for that distro version.
How do I know this? Go to a developers' or tech conference, and what is the prevailing OS? 9/10 times these days, it's MacOS.
As much as Apple may be control freaks, they did desktop UNIX right and developers have flocked to them.
Meanwhile, the Linux desktop community is still having the same debates, the same difficulties it had back in the mid-90's. Dependency issues. Lack of compelling use case software; and that which does exist is versions behind current. Fragmentation.
The only areas in which the Linux desktop has moved forward is UI and user-oriented management, and Ubuntu deserves much of the credit for the latter.
Add to this the gradual movement away from the desktop paradigm to mobile. The desktop paradigm will still be around for several years (especially for creators--developers, media folks, etc), but the end-user is moving more and more towards smartphones and tablets for consumption. Extend those with external keyboards and monitors (think docks) for light creation work (documents, spreadsheets, etc), and you've the future.
Nope. The Linux desktop is dead.
This is really premature because there are major flaws with the other options. Apple is a closed-down walled garden (and it's expensive), for example.
The Linux ecosystem is extremely diverse, so maybe somebody will come up with something that can compete well with these other flawed models.
For example, maybe Ingo Molnar's post will inspire people to take things in new directions.
> The desktop paradigm will still be around for several years
That's a ridiculous understatement. You can't do serious work on smartphones and tablets and that's not going to change until we're going around plugging them into projectors and keyboards, at which point they're just serving as desktops anyway.
I've been using Ubuntu for 4 years now, and the latest releases are more streamlined and usable than ever. The mainstream distros have come an extremely long way in being more user friendly to non-tech users.
Also, I have mobile devices, but I still end up using my desktop a whole bunch for two simple reasons: 1. when you actually need to type something of any length, a full-size keyboard is wonderful; 2. it's nice not to have to squint at a smaller screen to see what's going on.
Then Win 95 came out and that had a decent desktop. I remember when the KDE people started talking about a desktop for Unix and people didn't get it, but when we saw the beta it was like... Wow!
Then Red Hat didn't like the license of KDE, so they had to create Gnome. As a result, rather than having one good desktop, the average Linux has two half-baked desktops. This fork has wasted people's energy and been a distraction from an excellent experience.
Another example of this is sound. I don't know how many incompatible sound APIs exist for Linux now, I know it's more than the fingers on one hand. The consequence of it all is that often sound doesn't work and unless you're a crazy enthusiast you might never get it to work.
I was a Linux zealot until 2003 or so when I had a job that had me using a Windows machine a lot, and by that point there was Win XP which was a huge improvement over Win 95.
I still use Linux on servers, but desktop Linux has largely disappeared from my life. Every so often I try to install it here or there, but I typically find the experience disappointing. I was a Fedora fan for a long time, but Fedora became increasingly finicky about where it would install. I switched to Ubuntu, but every installation ends up having some serious problem.
For instance, Ubuntu installed just fine on my PPC Mac Mini with the exception that the fan runs full speed all the time and the machine sounds like a vacuum cleaner.
Windows and Mac OS have been on a general trajectory of improvement -- sometimes there are changes you don't like, but the overall direction is good. Linux did, after years of struggle, get a stable multiprocessor kernel (2.6) but other than that I get the feeling Linux has been going backwards not forwards.
When was the last time we had useful improvements to the OS X user interface? 10.4 (2005) or 10.5 (2007), in my opinion. They've certainly improved under the hood, but the improvements to the UI have been mostly gimmicks like Expose.
When was the last time we had useful improvements to the Windows UI? That would be Windows Vista, 2006/2007. (W7 was basically just a stable version of Vista.) Windows is certainly attempting to improve the UI with Metro, so that's a 5-year wait for improvements.
OTOH, in Linux land we've had KDE4, Gnome3 and Unity all land in that time period. Every 6 months we receive useful new improvements to our UI. Sure, the initial reception of KDE4, Gnome3 and Unity was negative in each case, but the haters are always the loudest. I haven't tried Gnome 3, but Unity 12.04 and KDE 4.8 are both really nice, much better than the OS X or Windows 7 UIs, in my opinion.
And it's not just the UI. It takes about 10 seconds for my computer to leave the BIOS and have both Firefox & Emacs open in Ubuntu. It takes the same machine over a minute to have Steam open in Windows.
End-users don't want to have to learn an entirely new UI (read, a different way of doing things; or, "Where's my Start button? Everything I know how to do is under that.") every couple of years. Not because they're (all) dumb, stupid, or lazy.
It's because end-users view a computer as a tool to do what they need/want to do--quickly and efficiently. Anything that distracts from that (like having to re-learn where everything is, and how to do the task they've done the same way for several years) is a negative and annoyance.
Unfortunately, the technology community has forgotten that.
And you still keep saying that "sound doesn't work". It's literally been years since I've had problems playing sound.
> I get the feeling Linux has been going backwards not forwards.
Hmm, I can play most media out of the box on Linux. I think that's a huge step forward.
A quick Google search reveals this was the fix and was found in 2008:
http://ubuntuforums.org/showthread.php?t=1004899
Perhaps I'm some kind of genius at using Google.
As a Linux nerd, I'm fine with the command line + Synaptic, but look at how readily people have used the iTunes Store, Amazon store, Google Play, etc. All of those have much less friction for finding and downloading the right software than most Linux distros have. Ubuntu's market is close but...
WHERE IS THE NON FREE SOFTWARE?!!
If linux wants to do well for humans, paid, proprietary software NEEDS to exist on the platform. Ubuntu Software Center comes close, but it still kind of sucks.
Also, as a dev, it wasn't until VERY recently that you could even sign up to publish an app that was commercial in nature. It is hard to build a real marketplace when you're asking developers to give away all their work for free so that you can sell more operating systems (or support contracts) without the software dev seeing a dime.
As a software developer I can't feed my kids with free downloads on an open source operating system used by people who don't like paying for software.
Make it easy for devs to build software that people will pay for, then get operating system users who will buy that software for real money and you'll have fixed the Linux Desktop problem.
1) Distribution of software as a cross-distribution package that just has everything it needs inside a directory, libs and so forth. If you said this N years ago you were an asshole, because "duplication of files blabla" and so forth. The typical example was "a user should simply go to the web site of some application, download a file, and click on it to execute the program".
2) Device drivers with a well specified interface between the OS and the hardware, so that different versions of the kernel could use the same driver without issues.
People complained a lot, with technical arguments, about why a different approach is better than "1" or "2" by some kind of nerd metric. So the reality, for me, is that Linux does not succeed on the desktop because it is "run" by people with a square-shaped engineering mind. There is no fix for this.
But his reasoning breaks down when he says the relative dearth of commercial applications for the Linux Desktop is due to this issue. That's not true.
The main reason why OSX/iOS, Android, and Windows attract more commercial developers is because those platforms have a much greater installed base!
That's a perfectly understandable focus - even laudable, if you take the GNU line - but it makes it hard to release reliable, tested binary software. The landscape has too many variables.
I think this is a significantly bigger problem than install base. And I think it would be addressed with an appropriate focus on exactly the problem Ingo is pointing out: the lack of a defined, high quality core that can be relied upon.
A few years ago, Linux had a bigger installed base than both iOS and Android. Both of them outran Linux with ease.
So the two situations are really not comparable. I still think the main difference is that you have to install Linux yourself. It's not even that installing is difficult--it isn't!--it's that normal people don't even realize it's an option. Your average random laptop buyer who just spent $600 on a laptop from Staples would be able to use Linux perfectly well if that's what his laptop came with--I suspect some wouldn't even realize it wasn't just a different version of Windows. But since his laptop invariably came with Windows, that's what he's going to use, not for any reason but inertia.
The last thing this discussion needs is troll comments.
... and therefore enables developers to make more money!
In my opinion it is one of the most important factors right here, combined together with a lot of marketing from companies which own the "platform".
Ubuntu is already trying to be more flexible with the Software Center and applications; however, open source is not a single walled garden that can adhere to the tight, monolithic architectures found in mobile.
In a Linux distro the apps are diverse, programmed in a multitude of ways in all possible programming languages (from Python to Lisp to C) and environments. That is reality and life, and what must be done must be done with awareness of that fact. And no, no one can force the broad, diverse open source world into tight control à la Apple.
This is simply false. There are plenty of complex apps with varying degrees of usefulness, and they certainly don't reduce to 80's game clones and few-screen apps. And even if it were true, it's not a result of anything intrinsic to the app model. Also, "tight control à la Apple" indicates that you missed his point.
How many times did you make and sudo-install shit on your system to get a basic app that should, for all intents and purposes, be walled off in a sandbox? Even without sudo, why should an install script have access to my personal files without explicit permission? Did you ever run into a situation where you wanted to install the latest version of an app, but it wasn't in your distro repo, so you decided to build it, only to find out that your repo's GTK+ library was out of date and your options were to rebuild all the GTK+ packages or update the OS to an alpha? Even I gave up at that point. Imagine the average user wanting the latest feature advertised on his favorite app's site.
I use the Linux desktop daily, but it can be hell, and you need to know how to wrestle with the system; it's certainly not idiot-proof, and it needs to be for mass adoption. But then again, "mass adoption" (IMO) shouldn't even be considered a realistic goal; being useful to techies/developers and available on servers is a good objective too.
If I write "mostly", I mean "mostly", and if you refute "mostly" with "there are plenty of", you are missing even the basic tenor of a sane discussion. Check the top 100 apps in the app stores/markets and you will see how wonderful those sandboxed apps are that you and Ingo show us as examples of success. They are "mostly" my-web-page-as-an-app-now hype.
You miss Ingo's point if you don't read his paragraphs carefully: he compares the Android/iOS core and apps to 20+ years of the Linux kernel/desktop and software from various architectures, programming languages and technologies. He even claims their core is stable, missing the point of how young they are. The proud iOS core can't span more than a few devices, and the number of deficiencies/issues people have had there is also interestingly high.
Finally, if you really are doing all those compilations to get the latest version of some software, you might reconsider your choice of distro and package manager. Clearly you are doing something wrong there; if you mention what you compile, on which distro and version, someone may be able to understand what you are trying to achieve and tell you what you should do instead.
In my own life the 'solution' I have found is to use FreeBSD; I get a stable, well-maintained core with a sharp distinction between core, userland and third-party (the ports system).
I have found the ports system to be a lightweight, agile alternative to GNU/Linux package managers:
When you install FreeBSD you are left with a kernel, standard UNIX command-line utilities and everything you need to hammer the system into a finely-honed tool.
Right now, I'm using it exclusively on my servers; on the desktop I'm willing to accept the trade-offs of Ubuntu (12.04 beta on my dev box, Xubuntu 11.10 on my netbook): a little instability and fully-automated updates are OK in exchange for not having to fiddle with graphics drivers, sound, Flash, etc.
Nowadays, any developer can create a package for his application (and any of its dependencies) and publish it in a self-hosted repository. Users can easily add such a repository to their system's sources list, and the bureaucracy problem is over. Well, some developers are doing this already - the only thing holding the rest of them back is either ignorance or the complexity of the packaging process.
I believe there's no need for an Android market clone (which is yet another centralized repository). What users may need is just a directory, pointing to external repos. Ubuntu market seems somehow promising (at least I remember seeing some dialogs like "you need to enable this source to install that package").
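For what it's worth, the user-facing half of this is tiny. A vendor's "add my repository" step boils down to dropping one line into a sources.list.d file; the sketch below writes under a scratch directory so it's side-effect free, and the repo URL, suite and file name are all made up:

```shell
#!/bin/sh
# Sketch: what a developer's "add my repository" snippet does. Everything is
# written under a scratch DESTDIR so this demo is side-effect free; a real
# installer writes to / and needs root. Repo URL and names are hypothetical.
DESTDIR="${DESTDIR:-$(mktemp -d)}"
mkdir -p "$DESTDIR/etc/apt/sources.list.d"
echo "deb https://example.com/debian stable main" \
    > "$DESTDIR/etc/apt/sources.list.d/example-vendor.list"
# A real snippet would also import the signing key and refresh the indexes:
#   wget -qO- https://example.com/key.asc | sudo apt-key add -
#   sudo apt-get update
cat "$DESTDIR/etc/apt/sources.list.d/example-vendor.list"
```

From then on the vendor's packages participate in normal `apt-get upgrade` runs like any distro package.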
Content duplication is not a problem. A real problem is keeping the system up-to-date when you want to update some library because of an important feature or bugfix. And it's nearly impossible with every application bundling its own copy of that library, with some copies being actually incompatible forks (and a lot of copies being just different builds - think of different compiler versions - of exactly the same sources).
This is all compounded by the fact that there is no app bundle. Mac OS X has the bundle and a terrific way to install it: Drag and drop it into the Applications folder, just like we did in the days of the Mac Classic and MacPaint. It hasn't changed. (Well, it did for a while, but thankfully they went back.)
If you say, "Well, that means you end up installing 5 different versions of the same library on the same system" - Who cares? Disk space isn't a priority any more, and from a developer standpoint it makes a lot more sense to target a dependency whose version number I know, instead of a dependency whose version only the package maintainers know for sure. I don't want to be forced to target v1.3 of a library if it's only been tested (by me) on v1.2. It makes for much better application stability for the developer to be in control of dependencies and not these package maintainers.
I want to ship a self-contained bundle of awesomeness, not a dysfunctional shard among ten thousand other shards.
What I'd like to see is the possibility of easily having multiple versions of software and libraries residing on the OS.
I think these problems are surmountable.
So really, there is less friction than on OS X--I don't have to drag anything anywhere! In fact, in Firefox, I just download the package and it figures out how to open it with the right program automatically. And then it just works.
Of course, if a program is in the repositories it's even easier, but that's a different story...
Nope. As simple as clicking a link with special URL scheme, like `apt+hXXp://archive.canonical.com?package=acroread?dist=feisty?section=commercial`
> This is all compounded by the fact that there is no app bundle.
I'm all for the bundles (which single-app repositories, actually, are!), but I want them to be non-monolithic (i.e. contain multiple separate packages).
I don't care about disk space — if I'm that constrained with disk space, that's another story that'll probably never happen to most ordinary users, who have terabytes of storage. But I certainly care about bugs, and if libXYZ 1.2 has a critical one, I want my system to be free of that version ASAP.
And I don't care that you've never tested your awesome app with 1.3 — it's better to be possibly unstable than certainly unstable or, far worse, vulnerable.
A "collection" type for installing multiple packages transparently would be nice, but I have to point out that nothing prevents you from doing the exact same thing that you do on Mac OS X, which is to distribute the .so libraries in the same package as the application, or even compile it statically.
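Bundling the .so files as described usually comes with a small launcher script that points the dynamic loader at the private copies before starting the real binary. A minimal sketch of that trick, with a hypothetical bundle called "widget":

```shell
#!/bin/sh
# Sketch of the launcher trick self-contained Linux bundles use: ship private
# copies of the needed .so files inside the bundle and prepend that directory
# to the loader's search path. "widget" and the layout are hypothetical.
APPDIR="${APPDIR:-/opt/widget}"   # bundle root; could be unpacked anywhere
export LD_LIBRARY_PATH="$APPDIR/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "loader will search: $LD_LIBRARY_PATH"
# A real launcher ends with:  exec "$APPDIR/bin/widget" "$@"
```

The `${VAR:+...}` expansion keeps any pre-existing loader path intact instead of clobbering it.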
The problem is, this is horribly, horribly insecure.
If you add my repo to your sources.list, I can offer a "security update" to any app on your system, a binary which does anything I want.
Obviously, it would be signed by me and could be traced back to me, and I would never do that ... but it would still be a really bad idea to go around adding repos to keep up with the latest GNOME FartButton Free. Add 1000 repos, and the odds of a successful hack or a stupid packaging mistake breaking your system go up by at least that much.
Repositories must declare what packages (or, better, package name prefixes, like `foobar-*`) they intend to host, and package managers must restrict them from installing something not from this list.
Then you can, for example, host your own libsqlite3, but it'll be namespaced as foobar-libsqlite3 with some `Duplicated-By: libsqlite3 (tested with >= 3.7.3, <= 3.7.7)`.
[Added after some thought] Or, better, let's just namespace package names, based on DNS. I.e., a repository at sqlite.org can provide org.sqlite/sqlite3, but not org.kernel/linux2.6. Obviously, trusted repositories won't be subject to such restriction.
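The proposed restriction is simple enough to sketch: a repository declares the package-name prefixes it may ship, and the package manager rejects anything outside that list. Prefixes and package names below are made up for illustration:

```shell
#!/bin/sh
# Sketch of the proposed allowlist: each repository declares the package-name
# prefixes it intends to host, and the package manager refuses to install
# anything from it that falls outside that list. All names are hypothetical.
allowed_prefixes="foobar- org.sqlite/"

repo_may_ship() {
    for p in $allowed_prefixes; do
        case "$1" in "$p"*) return 0 ;; esac
    done
    return 1
}

repo_may_ship "foobar-libsqlite3"  && echo "accept foobar-libsqlite3"
repo_may_ship "org.sqlite/sqlite3" && echo "accept org.sqlite/sqlite3"
repo_may_ship "linux-image-2.6"    || echo "reject linux-image-2.6"
```

Under the DNS-based variant, the allowlist for an untrusted repo would simply be its own reversed domain.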
NextStep made a good stab at the problem with bundles (.app, .service, .framework etc.), but once you moved outside of the abstraction layer you were right back in the mess. Most Linux distributions seem to want to emulate SunOS circa 1992, and any attempts to "improve" on the solution are drowned out by fundamentalism.
I actually thought the author nailed it in the last paragraph of part 2 of his post. The free software movement needs to start looking forward and not try to emulate what worked 20 years ago. There is definitely potential for a brave organization that is willing to tackle the challenge.
Not all open software projects are receptive to changes, improvements or bug reports from strangers, so nothing is really resolved without forking, which adds its own complications.
Of course there is still good open source / free software. It's just hard to come by.
Of course, on the proprietary platforms, the core programs--browsers, office, media...etc are all good. But that is true of Linux programs as well. And open source projects--even ones that are not terribly responsive--are still more responsive than most proprietary programs.
GNU/Linux distros, at least APT-based ones, are perfectly decentralized if the user wants them to be.
¹ If you have apt-url installed, which Ubuntu has by default
The only thing holding that back is the atrocious user-interface. Debian urgently needs to fix that and push their apt-infrastructure out of the 1990s.
In short: /etc/apt/sources.list must die.
This is how it must work:
apt-get install https://foobar.com/debian/squeeze/widget-1.0
A package installed like that must add itself automatically to a proverbial sources.list for future updates. Don't bother me with the housekeeping. Then add central indexing ('apt-get install widget' is nicer) and a web-of-trust ("1234 users have installed packages by this author"); package-signing looks about right already (except: no, don't make me run gpg).
This can (will and does) co-exist happily alongside the centralized repositories. Someone just needs to implement it and push it through the glacial Debian processes.
And while we're at it, there's no reason 'apt-get install github://foobar' can't be made work.
tl;dr: apt needs to absorb homebrew.
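The proposed URL shape (repo base, then dist, then package) carries everything a front end would need; a sketch of how such a wrapper could split it into a sources.list entry plus a package name (apt itself does not accept URLs like this today, and the URL is the hypothetical one from above):

```shell
#!/bin/sh
# Sketch: parse an "install by URL" command line of the form
# repo-base/dist/package into the pieces a wrapper would need.
# The URL is hypothetical; apt does not support this syntax.
url="https://foobar.com/debian/squeeze/widget-1.0"
repo="${url%/*/*}"                    # -> https://foobar.com/debian
dist="${url%/*}"; dist="${dist##*/}"  # -> squeeze
pkg="${url##*/}"                      # -> widget-1.0
echo "deb $repo $dist main"           # entry to auto-append for future updates
echo "then: apt-get update && apt-get install $pkg"
```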
Provide a .deb from your website, use the install script to add your repository to the sources.list. There, the user doesn't have to know or care about that file. And this is all possible - no, easy - to do right now.
> This is how it must work:
I disagree, having to run apt-get is too cumbersome to a regular user. But a better way already exists in the form of apt-url. Click a link, have the package downloaded and installed automatically.
> This can (will and does) co-exist happily alongside the centralized repositories. Someone just needs to implement it and push it through the glacial Debian processes.
But my point is that most of the infrastructure (support for multiple repositories, one-click installation of third-party packages) already exists. That's why I don't agree that this is the problem with Linux.
> And while we're at it, there's no reason 'apt-get install github://foobar' can't be made work.
I don't see how - there's no standard on GH projects for installation; some projects are installed by simple make/make install, others with easy_install/gems/npm, etc.
Launchpad does some central indexing ("other versions of this package") for PPAs, but I don't think it's accessible via command line. There is no indication how popular a PPA is, but PPAs are linked to admin user accounts which might help. Packages are signed and the key is auto-imported. Packages co-exist alongside packages from the repos, you can easily switch between different provided versions.
Installing a plain deb is much easier still, but you don't get upgrades (unless the repository is auto-added during install) or signing.
An excellent package manager that has existed for 14 years is being told to mimic a plainly inferior set of crutches? The world has definitely gone mad.
I admit that I do install about 90% of my programs as packages, but the problem of central authorities responsible for patching and distributing software and taking too long to do it isn't present in every distribution. I've used Arch Linux for years now, and it solves this problem by separating the packages into an 'official' channel of reliable maintainers testing and releasing new versions on the package system and a user repository where anyone can add packages.
To me, this seems like the optimal solution. Community-maintained packages can be promoted to official ones, from what I can see new versions are released from testing within days and if you're not satisfied with how others maintain the packages, building the packages from the newest versions yourself is almost as easy as installing binaries from the repository because anyone can use the build and packaging scripts used by the maintainers themselves.
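Part of what makes that model approachable is that the maintainers' build scripts are trivially readable: a PKGBUILD is just a short shell file setting a few variables and a build() function. A toy illustration (contents made up; real builds go through makepkg, not plain sourcing):

```shell
#!/bin/sh
# Toy PKGBUILD to show the shape of Arch's packaging scripts: a handful of
# variables plus a build() function, all plain shell. Fields here are made up;
# makepkg, not manual sourcing, is how real packages get built.
dir=$(mktemp -d)
cat > "$dir/PKGBUILD" <<'EOF'
pkgname=hello-demo
pkgver=1.0
pkgrel=1
build() { echo "compiling $pkgname-$pkgver"; }
EOF
. "$dir/PKGBUILD"
echo "package: $pkgname $pkgver-$pkgrel"
build
```

Because the whole recipe is this small, "build it yourself from the maintainer's own script" really is almost as easy as installing the binary.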
Of course the problem is that the big distros (redhat, debian) can't be bothered to care.
Another Hacker News thread is discussing the new release of Audacity.
http://news.ycombinator.com/item?id=3714766
and it occurred to me that I would like to try to compile a statically linked build of Audacity that could work on any version of GNU/Linux, from Ubuntu 12.04 down to (say) CentOS 5.7. Just a big binary blob that I could copy and run.
How would I find out how to do this? I've compiled little things before (the dwm window manager, qalculate).
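One hedged starting point, whatever build route you take: inspect what the dynamic loader would pull in. Everything listed must either exist on every target distro or be linked in statically / bundled next to the binary. A small sketch (defaults to inspecting /bin/sh just as a demonstration):

```shell
#!/bin/sh
# First sanity check when chasing a copy-anywhere binary: list its dynamic
# dependencies with ldd. A fully static blob shows none. Inspecting /bin/sh
# here is just a stand-in for the binary you actually built.
BIN="${1:-/bin/sh}"
if ldd "$BIN" 2>/dev/null | grep -q '=>'; then
    echo "$BIN is dynamically linked; its first few dependencies:"
    ldd "$BIN" | head -n 3
else
    echo "$BIN shows no dynamic dependencies (static, or ldd unavailable)"
fi
```

Repeating this on each target distro (or its oldest supported glibc) quickly tells you which libraries still need to be linked in or shipped alongside.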
I don't have any experience with RPM, but to build a package for a reasonably big part of the Debian-based world, you have to set up a build system (pbuilder/cowbuilder) and tell it something like `for DIST in lenny squeeze wheezy sid lucid maverick natty oneiric pangolin; do git-buildpackage ...; done`
The problem is, to get it right one has to find and read TONS of documentation.
Time and again I hear people in the lab complain about all these bugs in their applications; running Debian, I can honestly say I don't have this problem. Of course, I don't have the latest software either, but for me that's the price I pay for a stable system.
The ideas that are proposed are mostly things that have already been tried but have failed, either for social/manpower reasons (they would take enormous amounts of effort and time from all involved for little benefit) or technical ones (they simply don't work).
A large percentage complain that the distro in question updates too frequently, when there are clearly distros that cater to stability (they just aren't the "cool ones").
Some of the complaints demand some mythical OS that allows you to install it once, never have it update or change, but still have access to all the newest software. That would be wonderful. No one has figured that out yet - not Windows, not Mac, not any *nix.
I know bitchy comments aren't helpful, or likely to be well received, but there must be some other people of my ilk still on HN. Now that it's Product Guy News, where should I be going? Where is this story posted with people actually talking about interesting ways to improve things who actually understand the problem and could be called hackers without the technically skilled people laughing?
He's dead-on about the impassable problem of package updates affecting other packages. Having everything sandboxed with a general permissions system for directories instead of per-file is also better (this is how MacOS wanted to work before OS X). A free and open mesh network with reputation-based security is also the future.
But hey, linux is wide open, if these are the changes that are needed, we will see them.
Citing the Android and iOS ecosystems is all well and good, but in those ecosystems the OS comes first and the apps are developed downstream. You simply can't have that in a Linux ecosystem.
For me, it's a perfect combination of a vetted ecosystem (ok, somewhat closed but closed in the way I like -- no crapware, all legit source distros and mostly mature projects) with the ability to go outside that system at any time, at my own risk.
Anyone can set up a repository to add to/compete with Canonical's, and of course they do. So with my Willow Garage repos, I can keep up with their concept of what's stable, etc. It works nearly perfectly, IMSHO.
I assume MS and Apple have huge teams dedicated simply to making sure that all of this software works together nicely.
Perhaps if there were a desktop distro that cost $100 a throw, and that money were re-invested in testing the desktop platform more thoroughly and making it past- and future-proof, this would solve many problems?
Either that, or he's essentially suggesting we move to statically compiled packages, which, while tremendously inefficient from a space and security standpoint, would alleviate at least some of the headaches of trying to offer cross-distro binaries.
All it needs is mouse and keyboard support in the interface and higher-resolution apps, which will come for tablets anyway.
High-resolution apps are still somewhat lacking because the larger tablets haven't sold that well, but I hope this will improve.
The new Ubuntu for Android project will take over the world.
I believe the answer is: Linux doesn't have an IDE for desktop development. Look at Windows: they have the .NET platform and an IDE. Look at Mac: they have Xcode. And Android uses Eclipse as its IDE.
Most application developers want an IDE, which is what is lacking on Linux. I hope someone makes an IDE for Linux desktop development.
What is wrong with the above?
Linux is a hacker's operating system. When people try to make it into a desktop operating system, it starts to suck (GNOME 3).
Religious mode: on. Please, G+, somehow try to make a good karma system to keep the NSR low.