Wouldn't it be much more sensible to bundle all the dependencies together with the main package? That way you stay free of conflicts, and devs know exactly what you're running. The only issue I see is that packages may become big when they have tons of dependencies, but that won't happen often.

quick example, i currently have 32 packages installed which depend on libpng (png image codec)
if they included libpng themselves, i'd have an extra 31 copies of libpng installed
if there's an update to libpng, i'd need to update all 32 programs, rather than just libpng

That sounds awful

moreover, how would you then deal with alternative dependencies?
sometimes there isn't just one package which provides a particular dependency
for example, there are multiple available options to satisfy things like libjpeg or x264

>if there's an update to libpng, i'd need to update all 32 programs, rather than just libpng
I don't see how that would work though; the deps are there to make the package work. If there's a new version of the software, they can ship it with the new libpng or the old one.

That'd be up to the devs.

>Let's create a central point of failure, so the entire thing is fucked if the main package is fucked.
Also, as you said, packages becoming too big is a major issue. Why create an issue when the current system works just fine?

Yes.
For non-security-critical applications that's a good idea. I especially endorse static linking.
>size
People blow it out of proportion. And the people who do have this issue probably write poor code, since they've clearly used libraries where they shouldn't. Big applications are niche.

>I don't see how that would work though; the deps are there to make the package work. If there's a new version of the software, they can ship it with the new libpng or the old one.
most of the time, minor updates (bug/security fixes) are non-abi-breaking, meaning they're compatible with the previous version
for example, libpng is currently at "1.6.34" on my system, and the abi covers "1.6.x", meaning a program built against say, 1.6.2, will work with say, 1.6.12, without needing to update that program or make a new package
with what you're suggesting, if libpng updated from 1.6.2 to 1.6.12, every program that bundles it would need to be repackaged and reinstalled just to benefit from the update to libpng. if a program is say, 100M, you've just blown 100M of bandwidth to update a 0.5M component, and there could be many such programs
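here's a rough sketch of how you can watch this happen on a dynamically linked system (assuming libpng and its dev headers are installed; passing NULL to png_get_libpng_ver is allowed and just means "no png struct needed"):

[code]
/* build: gcc pngver.c -lpng
   compares the libpng version this program was compiled against
   with the version the dynamic linker actually loaded at runtime */
#include <stdio.h>
#include <png.h>

int main(void)
{
    printf("compiled against: %s\n", PNG_LIBPNG_VER_STRING);
    printf("running against:  %s\n", png_get_libpng_ver(NULL));
    return 0;
}
[/code]

update the shared libpng and the second line changes without rebuilding anything; a bundled copy would keep printing the old version until the whole program was repackaged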

>That'd be up to the devs.
exactly, right now it's up to me, the user. this is a benefit to me
if i want to use say, mozjpeg with every libjpeg program, i can. if i want to use the 10bit version of x264 with my programs that use x264, i can, just by swapping out those packages alone
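that swap works because the alternatives export the same abi. a minimal sketch, assuming the drop-in (say mozjpeg) is built for the same soname as your regular libjpeg (e.g. libjpeg.so.8):

[code]
/* build: gcc jpegcheck.c -ljpeg
   this binary runs unchanged whether libjpeg.so is provided by
   plain libjpeg, libjpeg-turbo or mozjpeg, because they all
   export the same libjpeg abi */
#include <stdio.h>
#include <jpeglib.h>

int main(void)
{
    struct jpeg_decompress_struct cinfo;
    struct jpeg_error_mgr jerr;

    cinfo.err = jpeg_std_error(&jerr); /* standard error handler */
    jpeg_create_decompress(&cinfo);    /* errors out on an abi mismatch */
    puts("libjpeg abi initialised fine");
    jpeg_destroy_decompress(&cinfo);
    return 0;
}
[/code]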

Isn't this what Stali is?

>>Let's create a central point of failure, so the entire thing is fucked if the main package is fucked.
this is the bullshit doubletalk that every hardcore open source fanatic says and is the reason why 'the year of the Linux desktop' will never happen
>huuur single point of failure is bad, lets spread the point of failure across the entire system that way the entire system wont be affected duuur

pretty much my admin life on windows

>use 20 packages with same dependencies
>get all the dependencies included with the package every time
why not, everybody seems happy about Steam games bringing their own redistributables every fucking time.

Actually the problem there is more involved. The problem is that on Windows you have component-based servicing. You can read up on it, but it's the entire reason Windows takes forever to update anything, or even to check whether it needs to update. You can run the same end-user DirectX setup twice and it'll take forever both times. They don't even attempt to optimize that crap.
On a saner platform it'd take seconds at most, and I'd consider even seconds a stretch.
Also, that user has a silly idea of installing single-program dependencies into a shared environment. That's a bad idea.

where do you draw the line?
even windows programs have external dependencies sometimes, namely things included as part of windows (like directx or .net), or things like mono/python

on my machine, i have 3 things that depend on mono. mono is 200M alone, that's an extra 400M of disk space used if they included mono
or how about something like gtk3? it's 70M, and i have 17 things which use it, that's an extra 1.12G of disk space if the programs included it themselves

This only affects proprietary software and fuck those.

gobolinux.org

it's not specifically a proprietary issue, it's an issue for programs that are "release and forget", things that never get maintained past their "shelf-life"
... which is most proprietary software
the other relation to proprietary software is that not just anybody can maintain it, so when it stops getting maintained, that's it, it's "done".

Isn't that what Apple does, basically just static linking?

mac os applications aren't necessarily statically linked
they appear to be one file, but that's not the case, an ".app" is actually a /folder/, it just appears as a file/executable in finder

That's the point of docker

>if there's an update to libpng, i'd need to update all 32 programs, rather than just libpng
Not if they keep the same ABI. Debian is good at that, for example. Arch doesn't, and you need to reinstall every package that depends on the library.

And Mach-O (macOS' executable format) also supports dynamic loading.

i'm using arch, and that's not the case, i don't need to update anything but libpng when there's a non-breaking update to libpng
programs that depend on a non-current abi instead use a more specific library path (such as "libpng12.so" instead of "libpng.so"), and libpng12 (1.2.x) is installed alongside libpng to provide support for the older abi
i have one program (steam) which depends on libpng 1.2.x, it's still better to have only 2 copies of libpng than 33 copies
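you can probe that coexistence yourself, a quick sketch (the sonames are assumptions from a typical arch install):

[code]
/* build: gcc sonames.c -ldl
   probes two abi generations of libpng living side by side */
#include <stdio.h>
#include <dlfcn.h>

static void probe(const char *soname)
{
    void *handle = dlopen(soname, RTLD_NOW);
    printf("%-16s %s\n", soname, handle ? "loaded" : "not installed");
    if (handle)
        dlclose(handle);
}

int main(void)
{
    probe("libpng16.so.16"); /* current 1.6.x abi */
    probe("libpng12.so.0");  /* legacy 1.2.x abi, e.g. for steam */
    return 0;
}
[/code]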

>Debian is good at that for example
It really isn't. I've lost track of how many times executables have complained about wrong versions of libraries and glibc, forcing me to manually rename 32-bit versions to canonical paths.

Arch even tells you which packages are just increasing version vs which are actually getting updates.

you're describing a different thing, namely package vs. software updates
the "-1" in say "1.6.34-1" is just the version of the package itself, not the software
an update to "1.6.34-2" for example just means something about the package itself changed, while the software is the same. this might be related to the dependencies, but not necessarily
most if not all distros have an equivalent, it's not specific to arch
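for illustration, a trivial sketch of that split (the version string is a made-up example):

[code]
/* build: gcc pkgrel.c
   splits an arch-style "pkgver-pkgrel" string at the last dash:
   pkgver is the upstream software version, pkgrel the package revision */
#include <stdio.h>
#include <string.h>

int main(void)
{
    char full[] = "1.6.34-2";        /* hypothetical package version */
    char *dash = strrchr(full, '-'); /* pkgrel follows the last dash */

    if (dash) {
        *dash = '\0';
        printf("software version (pkgver): %s\n", full);     /* 1.6.34 */
        printf("package revision (pkgrel): %s\n", dash + 1); /* 2 */
    }
    return 0;
}
[/code]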