Linux is literally such a piece of shit from every perspective...

Linux is literally such a piece of shit from every perspective, but especially an engineering one; it's embarrassing to see people advocate for others to install it.

Not only does nearly every desktop distribution have nightmarish usability issues, but every linux machine has to interoperate with the single worst production kernel code-base in the history of computers. A lack of software and driver development is one thing. A completely toxic architecture and approach to dealing with security issues is another. It's embarrassing how much shoving it took to get even ASLR added to the kernel, and security bugs are treated by Linus and his dick-suckers as a natural byproduct of their absolutely shit code and practices, no more important to them than any other bug.

Not only is the kernel itself completely shit, but so is the ecosystem built on top of it. X11 is widely understood to be a massive piece of shit, but nobody talks about how wayland just leaves font-scaling up to individual applications, leaving the door open to further future fuckups in the complete shitshow of incoherent desktopland. Nobody fucking knows if they want to continue the long-fucking-dead ritual of "UNIX principles" or if they want systemd to be able to brick peoples' EFI partitions and reimplement the rm command for whatever fucking reason.

I hope the new firefox busted a whole bunch of your ricing styles because you fucking deserve it for shilling for and using such an incoherent piece of shit system. Linux is the single largest embarrassment to software development since the Java platform and it's not going to get better. Abandon ship sooner rather than later.

Other urls found in this thread:

lunduke.com/2017/11/15/linux-powers-supercomputers-all-of-them/
en.wikipedia.org/wiki/SW26010

Is this a pasta?

I switched to macOS at home, and every server and workstation at work runs some version of Ubuntu.

I haven't been happier.

>It's embarrassing how much shoving it took to get even ASLR added to the kernel
Linux was the first kernel to support ASLR, beating Windows by almost two years (2005 vs. 2007). On Windows it also only works for executables and DLLs that declare support for it, whereas on Linux it has been on by default since the 2.6 kernel.
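
If you want to check a given box yourself, here's a rough sketch (Java, since OP brought up the Java platform anyway; the class name is made up, and it assumes a Linux machine with procfs mounted):

import java.nio.file.Files;
import java.nio.file.Paths;

// toy sketch: prints the kernel's ASLR knob.
// 0 = off, 1 = randomize stack/mmap/vdso, 2 = also randomize the heap (brk)
class AslrCheck
{
    public static void main(String[] args) throws Exception
    {
        byte[] raw = Files.readAllBytes(Paths.get("/proc/sys/kernel/randomize_va_space"));
        System.out.println("randomize_va_space = " + new String(raw).trim());
    }
}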

>all these flaws and it still works better than windblows

no i just wrote it because i felt buttmad about life after busting my phone at a concert. we all gotta vent once in a while you know? i dont mind if you copy pasta it though

You have no idea what you're talking about. OpenBSD used ASLR first, and their developers were the first to push for other operating systems to use it.

>OpenBSD used ASLR first,
You are correct, but that still doesn't mean that I "have no idea what I'm talking about", which is ridiculous hyperbole. The other information in my post is obviously correct.

The Linux kernel is the least bad of the desktop operating system kernels. NT is probably the worst one - you can cause a BSOD by resizing a window or hang it by opening a link in a browser.

Didn't read lmao

Thank you, Based God.

i dont care about the rest of what you said, because it has nothing to do with my post. i agree that windows is shit, but idk how linux mainlining ASLR before windows did has any bearing on the actual history, which you could look through yourself on the LKML, of trying to get the feature into the system in the first place. it took ~2 years, from what I remember! one of the bigger issues was that the status quo on the kernel didn't see it as important, which is specifically what I'm talking about... and which you didn't touch on or respond to whatsoever.

Top 500 supercomputer market share

......huh?

GTFO, shill.

>no drivers and no professional software
This is obviously a lie: not only does Linux run on far more hardware than most other OSes, there are all sorts of embedded architectures and server platforms running Linux with specialised driver support.

>i dont care about the rest of what you said, because it has nothing to do with my post
So let me get this straight, you replied to my post and then you claim that my post is irrelevant because your reply to me is irrelevant? WTF?

>One of the bigger issues was that the status quo on the kernel didn't see it as important, which is specifically what I'm talking about
But it isn't that important. Linux has supported PIC/PIE since forever, so security-enhanced Linux versions had their own address space randomization techniques.

>and you didn't touch on or respond to whatsoever.
That's because you replied to me, not the other way around.

lunduke.com/2017/11/15/linux-powers-supercomputers-all-of-them/
Linux powers all the supercomputers on the top 500 list. All of them. No IBM, no Solaris, no BSD, no Windows, all Linux.

>no drivers and no professional software
I didn't claim this, and i dont really care to, so dont think that's what im trying to say anyways. this is simply a common complaint. and when it comes to the professional software thing, at some level i sympathize with the sentiment, because there's a lack of UX consistency and interoperability in the desktop software on the platform, or at the very least it's far from the level of quality seen on other platforms. obviously, it could be better

>aslr wasnt important because linux supported pic/pie and then all the hardened linux forks could add in their own aslr
........why not just be secure by default? this is classic linux bullshit: if you flip *all these switches*, don't step in the wrong spot in this minefield, and remove the beartrap we left in your chair, you'll be fine!

>the rest of the shit in your post
are you autistic? you send me some shit about windows that barely has anything to do with what i said and then sperg out when i say i dont care about the shit you say that has nothing to do with what im talking about.

So what alternative are you proposing? Because I used windows for years and it's fucking shit that only gets worse with time, OpenBSD is ok but not my cup of tea, so what else am I supposed to use?

>I didn't claim this
No, OP did: A lack of software and driver development is one thing

If you're too stupid to understand: I'm replying to OP in the first part of my post and to you in the second part.

>........why not just be secure by default?
Because not everyone runs a fucking server. Linux is supposed to work on even minimalistic embedded processors.

>you send me some shit about windows that barely has anything to do with what i said
I was replying to OP. You replied TO ME, you dumb fuck.

....ok?

i dont think there's a good alternative right now. there are systems i personally like but obviously like a lot of you im not a fan of proprietary operating systems so i generally try to keep away from windows/osx. desktop users shouldnt change what they're doing right now desu, they should just use what's comfortable for them. developers on the other hand have a lot of work ahead of them that they don't have the presence of mind to even acknowledge

I am op dipshit

@63413466
>I am op dipshit
Then why are you claiming you didn't write stuff you clearly did?

You're obviously just trolling and baiting for replies.

>A lack of software and driver development is one thing.... architecture and...security issues is another.
its called subtext, autist

when i say "X is one thing, but Y is another" I'm talking about Y. I don't care about X. X could be hyperbole for all I care. this is simple english. idk why this is so hard for you LOL

>no argument
>in a post this long
Found the Rust shill.

>its called subtext, autist
I may be autistic, but it doesn't help that you're incoherent and outright lying about what you wrote.

>I don't care about X
You clearly mentioned X as "one thing".

>X could be hyperbole for all I care.
And I clearly pointed out that the hyperbole was ridiculous, and you completely sperged out over that.

lol ok dude you win. my post was actually about how bad gimp is and how theres no good driver for muh nvidia 1080

>my post was actually about how bad gimp
I suspected this.

>is and how theres no good driver for muh nvidia 1080
That's a lie though, Nvidia releases excellent drivers for Linux.

...

I don't know what your reaction pic refers to, but if it is an attempt at being """ironic""" about the driver statement, you are clearly mistaken (or not updated since 2006).

Nvidia is pushing their hardware and software frameworks heavily for academia and supercomputing, and in order to do that they are releasing all of their state-of-the-art in their Linux drivers.

Not OP, but supercomputers is a market where quality doesn't matter.
It also shows in the hardware, the fastest computer's CPUs are rather dumb when it comes to executing conventional software.

>desktop users shouldnt change what they're doing right now desu
Sadly for many of them this can't be done, because win10 is such shit; people turn to Linux not because they hear good things about Linux but because they can't stand Microsoft's bullshit anymore.

>but supercomputers is a market where quality doesn't matter.

yeah and then they get disappointed and make idiotic threads on Sup Forums, right? and so on and so on

>jvm
>embarrassing
go on

Nice argument, fag.

meh, suck an egg autismo.

You're speaking out of your ass. Quality obviously matters in HPC, otherwise it wouldn't be fucking called HPC. These supercomputers and computing clusters all use bleeding edge hardware and proven techniques for optimising calculation speed and data flow.

t. someone who's currently doing a PhD in HPC

you mention just the engine that java runs on, but have you ever used the shitshow known as Ant? or Maven? have you ever inherited java code from other people in your life? have you ever worked at a company that's lost data, customers, or worse, to mistakes that java so happily allows you to make?

I have. and the only argument against this kind of shit is
>buhhh if you write it correctly it's not bad
This is the first sign of programmer brain rot

Saying that JVM sucks because Ant is like saying that Linux sucks because Makefiles.

me:
>Linux is the single largest embarrassment to software development since the Java platform
>Java platform

you:
>Ah, the java platform, also known as the jvm

it's funny to me that people are actually as stupid as you

>the java platform is bad because this one time I worked at a company that had a really shitty codebase I inherited
You pretend to know what you're talking about, but you're really fucking stupid.

>HPC
should be called Dumb Fuckin Numbercrunching
>You're speaking out of your ass.
That's probably the lamest insult and I want you to come up with a better one in future.
>bleeding edge hardware
more like architectures and cache sizes from the early 90s
>and proven techniques
...
>t. someone who's currently doing a Pretty huge Dick in DFN
I'd even take you more seriously if you had a meme degree like deep learning or data science.

SALT? SALT

Dude seriously, the world runs on Linux because it's tailored to the needs of everyone, with a simple idea: need something done a certain way? Build it yourself.

The Linux community needs to take the desktop seriously and set some standards that everybody would be happy to follow. For one, I believe that until KDE and Gnome somehow merge, the Linux desktop won't be usable for many years to come.

>programming tools shouldn't work to prevent mistakes, bugs, and security problems caused by human error
you probably think everything should be written in c99 and bash, right? laughing girls dot jpg

>should be called Dumb Fuckin Numbercrunching
It's trying to do numbercrunching in the most efficient way, but whatever.

>That's probably the lamest insult and I want you to come up with a better one in future.
Unlike you, I don't come here for the sole purpose of insulting people. That's called shitposting, but nice of you to admit that's what you're doing.

>more like architectures and cache sizes from the early 90s
Again with the speaking out of your ass.

>I'd even take you more serious if you had a meme degree like deep learning or data science.
"Data science" is essentially what "business intelligence" was 10 years ago: a buzzword for "applied statistics".

...

>it's the IDE and compiler's fault I am allowed to be an absolute moron and shoot myself in the foot
If you want to be handheld, I suggest you do something else for a living.

>unlike you fucking infantile plebeians i never have my hand held. i program in plain fucking risc-v and have never had a bug in my code once in my life. i live on an island by myself and generate power without nobodys fucking help using a bicycle. you are all children in my eyes

>supercomputers is a market where quality doesn't matter

say that again slowly

>doesn't rely on testing, code reviews, programming conventions etc to identify and minimize the number of bugs in production code
>blames the tools rather than his own incompetence

using linux on your home computer is straight retarded. great for servers though
inb4 muh privacy fedora tippers

The top 500 supercomputers all run linux btw

According to OP, that doesn't matter, because apparently supercomputers is a market where quality doesn't matter, and they all use antiquated designs anyway, so gaming rigs would perform better

...i've never worked at a place that hasn't done those things? are you autistic? do you actually think that you can only have testing, code reviews, and code conventions OR strong static tools? these things aren't mutually exclusive, you know

>It's trying to do numbercrunching in the most efficient way, but whatever.
Aka the most trivial thing to design processors and compilers for. It's all JIT people ever brag with, forgetting that actual software isn't written like this. I could shove a rod of metal up muh dick and call it a supercomputer.
>Unlike you, I don't come here for the sole purpose of insulting people.
Then you're an idiot, because that is the whole purpose of forchinks. That's what it was carefully designed for and meant to be used for.
en.wikipedia.org/wiki/SW26010 proves me right, by the way.
>"Data science" is essentially what "business intelligence" was 10 years ago: a buzzword for "applied statistics".
Exactly. I even take those more seriously than DFN people.

>...i've never worked at a place that hasn't done those things?
Yet you are implying that all those things are impossible to do in Java, because it's supposedly just so god damned bad. lol

>Aka the most trivial thing to design processors and compilers for.
Apparently it's not so trivial, otherwise I wouldn't be publishing papers now, would I. Anyway, HPC is more than just processors and compilers you know.

>It's all JIT people ever brag with, forgetting that actual software isn't written like this
That's the point, most software is written by people who don't know how computers work.

>en.wikipedia.org/wiki/SW26010 proves me right, by the way.
In what way? What do you even mean?

It doesn't. One node can catch fire and kill a janitor, but since the nodes are basically cheap metal/silicon blocks and everything runs in parallel, it can be replaced easily.
If your point is that supercomputers are optimized for their niche, mine is that they are not nearly as optimized for their niche as gaming hardware is for its.

>Then you're an idiot, because that is the whole purpose of forchinks. That's what it was carefully designed for and meant to be used for.
M-muh internet hate machine

>Tesla p100
>cheap

ok let me go over this for you

> null references
this should be obvious. it shouldn't be possible to dereference a possibly-null value and blow up with a runtime exception. c# finally got this right just recently. other languages should follow its lead.
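
rough java sketch of the difference, since java is the topic (names are made up, Optional is plain stdlib):

import java.util.Optional;

class NullDemo
{
    // classic java: null hides in the return type, the caller may forget to check
    static String findUserClassic(String id)
    {
        return "42".equals(id) ? "alice" : null;
    }

    // the possibility of "no value" is now part of the type
    static Optional<String> findUser(String id)
    {
        return "42".equals(id) ? Optional.of("alice") : Optional.<String>empty();
    }

    public static void main(String[] args)
    {
        // this would throw a NullPointerException at runtime:
        // System.out.println(findUserClassic("7").length());

        // this forces the caller to deal with the empty case up front:
        System.out.println(findUser("7").map(String::length).orElse(0));
    }
}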

> exception handling
try/catch is among the biggest mistakes in programming language design. it creates extra invisible code paths that even the best developers can fail to account for. it's stupidly poor design to not simply treat errors as values, and worse to somehow conflate interrupts with those normal runtime errors
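
minimal errors-as-values sketch (hand-rolled toy Result type, all names made up; yes it uses null internally, it's just to show the shape of the idea):

class Result<T>
{
    final T value;       // present on success
    final String error;  // present on failure (null means success)

    private Result(T value, String error)
    {
        this.value = value;
        this.error = error;
    }

    static <T> Result<T> ok(T v) { return new Result<T>(v, null); }
    static <T> Result<T> err(String e) { return new Result<T>(null, e); }

    boolean isOk() { return error == null; }
}

class Divide
{
    // the failure mode is part of the signature, not an invisible code path
    static Result<Integer> div(int a, int b)
    {
        return b == 0 ? Result.<Integer>err("division by zero") : Result.ok(a / b);
    }

    public static void main(String[] args)
    {
        Result<Integer> r = div(10, 0);
        // the caller has to look at the error right here
        System.out.println(r.isOk() ? "result: " + r.value : "error: " + r.error);
    }
}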

> java "oo"
this obviously isn't what kay considered object-oriented programming to look like. it's so hilarious to me to see people have to create mocks in their tests for shit that absolutely doesn't need it, because they decided that all their code needs to be contained in classes that... all have their own internal state for some reason? literally why would people inflict that on themselves. and it's so obviously stupid at this point to try and organize code through a hierarchy of inheritance. there's literally no reason to not simply use interfaces / whatever your language calls them
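
quick sketch of the interface point (toy names, nothing here needs a mocking framework):

interface Clock
{
    long now();
}

class Greeter
{
    private final Clock clock; // the only "state" is an injected dependency

    Greeter(Clock clock) { this.clock = clock; }

    String greet()
    {
        // crude: works in UTC and ignores timezones, it's just a sketch
        return clock.now() % 86_400_000L < 43_200_000L ? "morning" : "evening";
    }
}

class Demo
{
    public static void main(String[] args)
    {
        // production: the real time source
        System.out.println(new Greeter(System::currentTimeMillis).greet());
        // "test": a fixed fake via lambda - no mocks, no inheritance
        System.out.println(new Greeter(() -> 0L).greet());
    }
}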

this could go on and on and on but i have better shit to do

>I write Java in a non-idiomatic way and therefore it is Java's fault that it crashes
Just admit that you're a second year CS student that just learned Haskell and you're gloating about it.

>Apparently it's not so trivial, otherwise I wouldn't be publishing papers now, would I.
I could publish papers on art history, too and call it a day.
>Anyway, HPC is more than just processors and compilers you know.
I know, but I have chronic disdain for most forms of networking.
>>en.wikipedia.org/wiki/SW26010 proves me right, by the way.
>In what way? What do you even mean?
If you have to ask...
For example, if your problem is so embarrassingly parallel that you can afford to have a NoC, you should know you're running EZ mode compared to what x86 vendors do on normal CPUs.
>I have a small dick and want to compensate with big machines
They are rather cheap given that they are built solely for this purpose. Also, [insert random joke about housefires being expensive]

>not catching null pointer references in unit tests

>I could publish papers on art history, too and call it a day.
Do that then.

>I know, but I have chronic disdain for most forms of networking.
I don't even know what to respond to this.

>For example, if your problem is that embarrasingly parallel that you can afford to have a NoC you should know you're running EZ mode compared to what x86 vendors do on normal CPUs.
The point is that few HPC workloads are optimised for exactly one type of architecture, and you usually can't justify building clusters and supercomputers for a single workload (although some do).

Also, modern GPUs are becoming increasingly effective at SIMD, which x86 processors used to be good at.

>They are rather cheap given that they are build for this purpose solely.
Ironically, they are called general purpose GPUs and they are powerful as general purpose (parallel) processors.

You seem to be under the impression that anything parallel is super easy. This is ridiculous; you might as well go around spouting "MOAR CORES", which is obviously something Sup Forums mocked rather than meant seriously.

You're not very intelligent. I feel bad for you. You need to think more, to be honest. Nothing else to say

i dont get it, what does this mean? i've never used haskell

>tests are better(?) than static safety
this is why software development is a joke

>can have static safety through catching a nullpointerexception
>refuses to use static safety because "exceptions are bad hurr durr"
Moving goalposts - the game.

>linux is shit because it hasn't replaced windows and macosx yet - the post

>static safety through catching a nullpointerexception
>catching an exception
>at runtime
>static

do you understand what static means LOOOL

all that namecalling just because Sup Forums is lacking individual user IDs like Sup Forums has as standard. :-D
I laughed my ass off.

>I don't even know what to respond to this.
It's not my fault admins usually are the worst form of spergs. Software developers are merely psychopaths, and EEs are mostly just sabotaging their crap with halfassed code, but that's usually harmless in comparison.
>You seem to be under the impression that anything parallel is super easy.
Not really, but it still seems like a privilege to me, compared to not being able to run stuff in parallel at all.

>this is why software development is a joke
Or: This is why software development isn't engineering.

>declare function that throws a nullpointerexception
>nullpointerexception isn't handled in calling function
>compiler error
>not static safety
???

The act of catching the exception isn't static checking, the act of declaring exceptions is.

Does it really help on Sup Forums?

lets get real for a moment. You install windows 10 in about 45 minutes and for most linuxes it's about the same, but remember: the biggest base linuxes are about 1 or 2 gigabytes while windows 10 can be more than 5 gigabytes of data. And what happens when you open them? On linux there are about 2 functional buttons, 3 are nonfunctional, and in those two buttons there are some pictures to play with. Maybe slide down or up. On windows though, you feel like a god. It's professional, it's fast, it's all functional. You have a godlike user interface. The thing that linux does not have. You have those two buttons in linux, but you still search google for codes to write in the terminal. Linux is free because it's not worth a penny. Nobody would sell a paper without taking the money. Just not sure why people still buy android while windows phone does 10 times more. Trust me. When you work with windows you know some genius ground breaker dedicated himself to give you an operational os

>It's not my fault admins usually are the worst form of spergs. Software developers only are psychopaths and EEs are mostly just sabotaging their crap with halfassed code, but that's usually harmless in comparison.
I have absolutely no idea what you're talking about here... This is just incoherent rambling to me. How exactly does this relate to networking in HPC clusters?

>Not really, but it still seems like a privilege to me, compared to not being able to run stuff parallelized.
Again, you fail to realise that the situation isn't so easy. Workloads have varying degrees of parallelism, especially when you have data dependencies. Moving data around between nodes is anything but trivial.
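
To make that concrete, here's a toy Java sketch (single machine only, so it ignores the inter-node data movement that makes real HPC hard; the class name is made up):

import java.util.stream.IntStream;

class ParallelismDemo
{
    public static void main(String[] args)
    {
        int n = 1_000_000;

        // embarrassingly parallel: every element is independent,
        // so the runtime can split the range across cores freely
        long sumOfSquares = IntStream.range(0, n)
                                     .parallel()
                                     .mapToLong(i -> (long) i * i)
                                     .sum();

        // data-dependent: each step needs the previous result,
        // so this loop cannot be naively split across cores
        long[] prefix = new long[n];
        for (int i = 1; i < n; i++)
        {
            prefix[i] = prefix[i - 1] + i;
        }

        System.out.println(sumOfSquares + " " + prefix[n - 1]);
    }
}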

could you show me what code produces that error at compile-time?

I had apparently forgotten that NullPointerException in Java is an unchecked exception that isn't required to be caught. In my defence, I haven't used Java in 7 years.

For regular exceptions, a compiler error is issued.
class NullP
{
    // foo declares a checked exception, so the compiler forces every
    // caller to either catch it or declare it in its own throws clause
    private static void foo(String arg) throws Exception
    {
        if (arg == null)
        {
            // a checked exception, unlike NullPointerException
            throw new Exception("ya doofus");
        }

        System.out.println(arg);
    }

    public static void main(String[] args)
    {
        foo("test"); // compile error: unreported exception java.lang.Exception
    }
}
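
And if you change main to actually handle it, it compiles:

    public static void main(String[] args)
    {
        try
        {
            foo("test");
        }
        catch (Exception e)
        {
            System.err.println(e.getMessage());
        }
    }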

good on you for both admitting to a mistake and also not using java for 7 years

I like you.

OP is right. I prefer Windows with Chrome as a browser. That way I get the full botnet experience.

But do you also enhance your experience with some kind of browser plugin and fully activated IME and additional anti-virus?

No user. Truly not. But in the beginning of the conversation here it was all about who said what in which comment. And that was funny to read.

But the way things are discussed in the programming and software development world is the reason i NEVER ask for help in any case where i really would need it.
I made it a hobby instead of making it a job.
My detcord is pretty short and i would prolly chimp the fuck out in such a scenario.

this is the best post in the entire thread

many programmers actually have autism or just trouble with social situations, so they're really bad at interacting with people, like worse than the average user

Gnome has been the standard for a while now.

>I made it a hobby instead of making it a job.
Well done.

Right!!! that is also my experience.
I once worked at a company where we developed some SQL database program stuff, and a software developer totally raged at me and stalked my telephone the whole fucking night, just to prevent me from getting any sleep. It was pure sabotage. The next morning at work i was in total zombie-mode and beat the fuck out of that fatso. Got fired for it, but it was well worth the trouble i got into. Since then i have never double-clicked a compiler program on any kind of job.
That fatso was not the only one harassing me over random unimportant things. they are mostly bottlepissers and isolated sociopaths.
Not to drag all devs over the same shitpile, but the average dev is a catastrophe.
I went into CNC machining and machine engineering instead. That way i can still type some code and am able to skip the psychos.

>machine engineering
hard to believe this is a psycho-free zone

the more you go toward hardware-related stuff, or better said toward the mechanical components, the more based these people are.
sure, you can have different views on using specific angles and axles and stuff, but it's more an exchange of ideas toward a possible solution than a rant with harsh words just because one's ideological way of programming got butthurt by the other's.
And very seldom do we talk about such stuff via the internet or in chatrooms. We do it mostly IRL. that's also a way to prevent people from behaving nuts.

Bump

Cool madlibs, bro.

OK anons, i really didn't want to burn up the ongoing conversation about linux in this thread.

My experience with linux is some 15 years old, and i basically only tried out Suse Linux (forgot the version number) and Ubuntu/Kubuntu 6.03 or whatever it was.
I was overall able to figure out and solve some things myself, but what about the average joe using this system, confronted with sound that works fine until you start the VLC player or some other one, and then sound is totally missing despite the codec already being installed? The codec wasn't bound correctly and therefore not working. Why must that be the case? I mean, as a client version it targets the end user, and the end user is normally not very experienced in using the console to fix that. If such problems repeatedly occur he will most likely drop Linux again and go back to windows. Which i also did after some months, being tired of the endless config war and raping gooooogle for help and command lines.

I don't know if that has gotten any better now and whether most of these issues are long gone, but according to my experience that was the biggest obstacle to getting Linux spread wider. And that made me a bit sorry, because i appreciate people working together on a thing bigger and greater than the individual being part of the whole.
It's simply lost potential. And besides the whole namecalling here, which i had to laugh about, some of these things at least at the kernel level still seem to remain, or are at least discussed at higher temperatures.

Android is better than any desktop OS and it uses Linux. When UI in a phone is smoother than in any desktop OS that's a sign that something's gone horribly wrong somewhere.

Also the only OS in existence with proper fucking scaling.

My guess is it's all about the drivers.
I mean, as far as i understand it, stock Android comes without any device-specific drivers.
It's then up to the devs at Samsung, Sony and the other companies to make Android run smooth and stable.

I remember back then that many hardware suppliers were not really linux-friendly about giving out the APIs of their hardware components and chipsets. Linux devs had to reverse engineer to get things running, and i also remember ATI being a pain when it came to getting 3D working with OpenGL.
ATI never responded, never gave out any APIs and almost never released any usable drivers.

but muh communism...

it is now

I have a better guess. Graphics stacks for desktop OSes were designed at a time when GPUs didn't exist. Neither did multicore systems. And now all that legacy crap code weighs everything down.

Stock Android has the same drivers Linux kernel has. Except manufacturers don't base recieve stock Android, they recieve development kits from SOC manufacturers(like CAF from Qualcomm). Then they bloat it to hell and make it run like crap. The best thing you can do to your phone is to install pure CAF ROM if it's available.