Static Linking vs. Dynamic Linking

Which is superior, Sup Forums?

Today: Static Linking.
Because you have no idea what stupid shit will be done to some library you linked in a few years.
With the way cunts like GNOME act, it's almost certain that unless you're constantly rewriting shit to keep up with their changes, your program won't even compile after a few years.

Static. Dynamic linking's problems don't make up for its meager benefits.

Depends on the licensing scheme; otherwise, probably static, if it doesn't cause any problems.

only making dlls because the third party license demands it

Dynamic linking for memory starved environments like embedded systems

Static, obviously.

Static if you're some stupid proprietary cuck.
Dynamic otherwise.

static is better in all cases.

>Have to recompile everything if a library changes
No fucking thanks.

Symlink (static+dynamic) linking.

That's why version numbers and changelogs exist, you retard

What does that have to do with anything?
If it's statically linked, you'll need to relink your program (which almost certainly would just be a rebuild).

With dynamic linking, that's where you would give a shit about API and ABI instability.

That's fucking retarded.
If the ABI hasn't changed, you just need to relink.
If the ABI has changed, you need to recompile everything.
This is the case whether you're linking statically or dynamically.

Sure, but not every library is going to break their ABI every release.
In fact, most serious libraries try as hard as they can to not break it.

dynamic is technologically more advanced, complex and challenging
static is all you need in 95% of cases

better library decisions for embedded systems

should I increment semver when applying newer versions of dependencies if my code doesn't change? asking for a friend

Why would you need to outside of security updates?

Depends on your use case

If you want a nice little self contained binary, static is better

If you're using containers (lxc, docker, flatpak, snap, plain old chroot) it doesn't matter but dynamic produces smaller containers

If you need to depend on one of several libraries that expose the same ABI (e.g. OpenGL, Vulkan, etc...) you need dynamic (though I'd recommend dlopen instead of compile-time linking for that)

don't use 500 libraries you don't need then

Not linking at all.
Well designed compilers and languages can load, optimize and emit billions of LOC in seconds. Stuff like LTO shouldn't even be a thing. Linking as an external step is a retarded path dependency, and the people back then either didn't have enough RAM or should be glad I don't have a time machine and a coat hanger.

Now, don't make the mistake many make when creating plugin interfaces and reach for scripting shit.
Despite what lots of people did in the past and regretted later, building software extensions with scripting languages is the worst solution. Dynamically loading binaries is still better than that, but the best is high-throughput, low-latency OS IPC.

>dynamic is technologically more advanced
Doing things on the wrong end isn't technologically more advanced. Stuff like
- LTO
- PGO
- JIT compiling
is really just damage control.