
Every binary distro I've ever used (SUSE, Debian, Ubuntu, RHEL and Fedora) came with a little GUI that let you install packages without going through a repository. Just download the .deb/.rpm and double-click on it. This is how a lot of projects deliver their latest builds. If you're talking about a unified package format (it doesn't seem like it) then that's a completely different issue.

As for 'stand-alone' applications, that doesn't make much sense for open source stuff, but there's nothing stopping you from distributing your software that way. A lot of commercial software for Linux is delivered that way.
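The double-click install described above can also be done from the command line; a rough sketch, assuming a Debian- or RPM-based system (the package filenames are hypothetical):

```shell
# Debian/Ubuntu: install a locally downloaded .deb
sudo dpkg -i some-app_1.0_amd64.deb   # hypothetical filename
sudo apt-get -f install               # pull in any missing dependencies

# Fedora/RHEL: install a locally downloaded .rpm,
# resolving its dependencies from the configured repos
sudo yum localinstall some-app-1.0.x86_64.rpm
```

The GUI installers mentioned above are essentially front-ends for these same tools.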



But everything packaged that way still demands to be installed as root, is linked against its depgraph of ancient fucked-with libraries, and is split apart into pointlessly separate packages.


It asks for your own password (à la Mac; that said, it could simply ask you to confirm, à la Windows).

End users don't do linking or worry about dependencies.


Unfortunately this isn't used enough. Firefox and Thunderbird, for example, don't use it; instead they provide .tar.bz2 files.

I was pleasantly surprised by Skype and Chrome, which actually provide rpms and debs. Nice!

OpenOffice gave me a tarball containing a myriad of debs. Not the nicest kind of packaging.

Unfortunately the distributions perform their own packaging of many of these applications, which may give some people the idea that they should use the version packaged by the distribution. The distributions should really stop doing this and refer people to the official packages instead; otherwise users will be stuck on old versions.

I also don't know how well this kind of delivery mechanism works with automatic updates. Do the external packages get updated in the same way as the packages delivered by the distribution or does anything like Sparkle exist for Linux?


I really don't understand what you're trying to get at. Are you concerned with out-of-date packages? The whole point of package repositories is to avoid that problem, although QA testing and policies can make that non-obvious to users who aren't running some sort of 'unstable' option.

Some projects (Chrome, I think?) release packages that actually add their own repository to the user's system so that the user can receive updates through their standard package manager. Other projects (Firefox) have a built-in "check for updates", but then you end up back in the decentralized Windows/OSX update hell. That option is generally disabled by distro packagers for obvious reasons.
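For the curious, the mechanism is just a dropped-in apt sources file; the entry Chrome's .deb installs looks roughly like this (path and line may differ slightly by version):

```
# /etc/apt/sources.list.d/google-chrome.list (created by the Chrome package)
deb http://dl.google.com/linux/chrome/deb/ stable main
```

After that, a plain `apt-get update && apt-get upgrade` picks up new Chrome releases along with everything else from the distro.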

I've seen a lot of people citing the lack of package repositories and package managers as a weakness of Windows/OS X but this is definitely the first time I've seen it the other way around.


Yes, of course I'm concerned about out-of-date packages. For example, my Fedora machine at work only has packages for Firefox 3.5. I tried using the tarball from Firefox's home page, but for whatever reason it was really unstable.

I don't think it is unreasonable to expect to be able to install the latest version of such a major open source product as Firefox without hassle.

I don't know what you mean by the Windows/OSX update hell. Perhaps you mean the lack of a centralized place to look for updates. Here I agree that the situation is less than ideal, but I wouldn't call it hell. I'm sure the situation can be solved without relying on a centralized repository of all software.

The way Chrome does it may be the best option. Too bad it's not used by more projects.

The main thing I want is a clear separation between the core platform and the applications. Then the platform vendor can focus on the platform and the application developers can focus on their applications.



