At what point does software get too complex for one person to understand?

I was doing some reading about how many millions of lines of code various software today contains and I started wondering, at what point does software become so complex that it's impossible for one person or even a small group of friends to actually understand all the aspects of how it works? Modern versions of browsers like Chromium or Firefox for example have more lines of code than versions of Windows from the 90s and are comparable to current versions of the Linux kernel.
>Firefox: 17.9 million lines of code
>Google Chromium: 16 million lines of code
>current Linux kernel: 16.8 million lines of code
>NT 3.1 from 1993: 4-5 million lines of code
>NT 4.0 from 1996: 11-12 million lines of code
>Windows 2000: 29+ million lines of code
>Windows XP: 40 million lines of code
Sources in order of appearance:
openhub.net/p/firefox
openhub.net/p/chrome
openhub.net/p/linux
knowing.net/index.php/2005/12/06/how-many-lines-of-code-in-windows/

That's why you either document your stuff so others can pick it up easily or it dies and falls by the wayside to be replaced by someone forced to reinvent the wheel as soon as you jump ship.

>That's why you either document your stuff so others can pick it up easily
That's really not what's being asked here. Also, most people contributing to FOSS don't need to understand every last bit of what they're contributing to in order to contribute to part of it.

wasn't win7 something like 60-70m lines of code while vista was like 90m?

anyway, really software gets broken down into more manageable "subsections", such that a team of people can develop an entire subsection of a certain program independently of other groups.
this is probably obvious, but it serves the point here: we can break something even as complex as an operating system down into components and have people develop the components independently of each other, with managers handling the subgroups and coordinating with each other at a higher level...


in the end no one can know intimately how a piece of complex software like an OS works; rather, it takes a team of engineers working together to construct a _collective_ vision
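The decomposition described above can be sketched in a few lines: teams agree on a narrow interface, and each component only ever sees that interface, never another team's internals. This is a toy illustration with hypothetical names (BlockDevice, RamDisk, FileSystem), not code from any real OS.

```python
# Toy sketch of component decomposition: the FileSystem team never
# sees RamDisk's internals, only the agreed-upon BlockDevice interface.
# All names here are hypothetical, for illustration only.
from abc import ABC, abstractmethod

class BlockDevice(ABC):
    """The interface the storage team agrees to provide."""
    @abstractmethod
    def read_block(self, n: int) -> bytes: ...
    @abstractmethod
    def write_block(self, n: int, data: bytes) -> None: ...

class RamDisk(BlockDevice):
    """One team's implementation; its internals are invisible to callers."""
    def __init__(self, blocks: int, block_size: int = 512):
        self._store = [bytes(block_size) for _ in range(blocks)]
    def read_block(self, n: int) -> bytes:
        return self._store[n]
    def write_block(self, n: int, data: bytes) -> None:
        self._store[n] = data

class FileSystem:
    """Another team's component: it depends only on BlockDevice."""
    def __init__(self, dev: BlockDevice):
        self._dev = dev
    def save(self, n: int, text: str) -> None:
        self._dev.write_block(n, text.encode())
    def load(self, n: int) -> str:
        return self._dev.read_block(n).decode()

fs = FileSystem(RamDisk(blocks=8))
fs.save(0, "hello")
```

The point is that the filesystem developer can hold their whole component in their head without understanding the storage layer, and vice versa; nobody needs the full picture.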

Any day now we will need AI to write our software, since it will be too complex for us to understand. Every prof writes about this: cs.cmu.edu/~pavlo/blog/2015/09/the-next-50-years-of-databases.html
>These future systems will be too complex for a human to reason about.

Programming, along with bus drivers and doctors, will be among the first to go when AI matures.

I know (or at least have a vague idea of) how modern software development works, with parts of the project being handled by different groups of people. I'm just wondering at around what point software becomes so large and complex that one person or a small group of friends can't understand it in its entirety. I know there are multiple OSes for older 8-bit/16-bit computers that were written mostly or entirely by a single person, so it isn't just Terry Davis that's capable of such a thing.

Forget about operating systems, think about software like medical diagnostic software.

They will write an advanced diagnostic AI that insurance companies will push to make standard, as it reduces the risk of lawsuits over incorrect diagnoses. It will be so complex that nobody will be able to figure out what it's actually doing, much like how the procurement regulations for the Pentagon are so complex they literally use Watson to procure weapons contracts. Other things that will be incredibly complex: genome prediction software to analyze embryos, and possibly genetic editing software. It will work, but nobody will get how, and we'll just run with it. Then soon we will be reliant on it working, and won't be able to disable it to fix errors, so we'll just keep creating more AI to fix the previous AI, as that will be much easier.

bump

how are firefox and chrome as big as linux?

>Firefox
>18 Million lines of code
JUST

If you handle it right - and that's a rare skill - the software gets too complex as soon as the features themselves get too complex for one human to understand.

Operating systems and browsers, however, aren't good examples, because by design they are hard to keep in a maintainable state, suffering from inner-system effects, arbitrary built-in stuff that doesn't make sense, legacy nonsense and other problems.

They basically are their own OSes.
Also, they contain features like VR support, speech synthesis, database systems and multiple compilers.

because Linux is just a kernel, while Firefox and Chrome are effectively entire operating systems.

when you are the developer of that software (or understand it well) and you are unable to rewrite it in a week, then it's too complex

When you get past "Hello, World!"

If it's a driver, or a critical component, yes they do.

These things always make me chuckle. 18 million lines for a web browser?
180 thousand is more than enough, and people try to pretend Mozilla hasn't turned into a botnet yet.

When do multiple people get better at understanding a complex problem than a single person? I say never. You're mistaking LOC for complexity. Very big projects are modularized by necessity. Ideally, the complexity is contained within one module.

Of course it'd take forever for a person to get involved with every piece of that software.

Browsers are full-blown rendering engines and language interpreters; they include WebGL, WebRTC and similar massive monsters.

Browsers are botnets by standard, user.

The point you are missing is that there is no reason for them to be that.

>no good reason
ftfy
W3C existing with botnet vendors as members is a reason.

I don't dismiss this point. But there are things called standards, and if someone wants to develop a serious browser, part of that is following the standards no matter how stupid they might seem. But seeing how Mozilla hails WebVR, I think they actually enjoy them.

Precisely

You're the kind of guy who now claims to be bisexual because it's now the standard, I presume?

html+css+js don't need to be THAT big
charon or netsurf are much smaller

Why do so few people seem to get this?

Started work at this new company as a web dev. And despite them exclusively using standard PHP, they never document, comment, or test anything.

So the result is that there are bugs that completely break the application. Every day I'm fixing a missing box here, or a client is saying "this doesn't work", and it's a really basic function like an email alert.

No, social and moral "standards" are highly dynamic; I don't perceive them as standards at all, more like group behavior patterns, and I have little respect for them. Web standards are "written in stone" and will be there for decades to come.

how come the Linux kernel is 70 times larger than Plan 9's if they do mostly the same thing

if the project is well managed and maintained, you don't need to have one person understanding everything. let people focus on the modules they are responsible for, and interface with other modules via an internal API if needed.

ball of tape syndrome

Oh yeah right, I forgot that the writers of the standards are gods rather than humans affected by group behavioral patterns. My mistake.

7 million LOC (about half) are in drivers
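Claims like this are easy to sanity-check yourself against a kernel checkout. A rough sketch of a raw line counter follows; the paths in the comment are assumptions, and counting raw lines (blanks and comments included) overestimates compared to a tool like cloc.

```python
# Rough LOC counter for sanity-checking claims like "half the kernel
# is drivers". Counts raw lines in .c/.h files, so it overestimates;
# a tool like cloc gives more precise numbers.
from pathlib import Path

def count_loc(root: str, exts=(".c", ".h")) -> int:
    total = 0
    for p in Path(root).rglob("*"):
        if p.is_file() and p.suffix in exts:
            total += sum(1 for _ in p.open(errors="ignore"))
    return total

# On a real checkout you would compare, e.g.:
#   count_loc("linux/drivers") against count_loc("linux")
```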

You can keep using a console browser; that's on you. Oh, what's that? Only some websites work? Enjoy your browser that ignores standards.

>You can keep using console browser, that's on you.
Spot on.
Luckily I have so far not found any useful website that doesn't work with it. Could you give me an example of a good site I'm missing out on with my console browser?

You know why cyberpunk media has people working with simple machines? That stuff can be reasonably audited by individuals to check for naughty edits.

how the hell do you post on Sup Forums from console browser???

i had a good understanding of the Linux kernel after one week of browsing the code and reading documents here and there. actually, most of the code comes from the modules (filesystems and drivers)

Like I browse any other website? (Note that Sup Forums isn't a useful website, though.) So try again: name one useful website I'm missing out on?

>I'm using a real browser but will discard anything saying that I don't _need_ it or that it isn't _useful_, regardless of using it.
How do you get around recaptchas, or the catalog requiring JS?

You've dodged my question twice. Answer mine and I'll answer yours. Tip: you don't have to solve the captcha.

ok
so it's 35 times larger (even though Plan 9's kernel also has drivers)

many uni websites, lichess, any website using MathJax or similar to render LaTeX, email web interfaces (inb4 they are redundant and suck - IMAP sucks more), Google Drive and Docs (I'm not living in a vacuum and have to cooperate with other people), many official websites of TV shows I like
how do I avoid captchas? (I'm not paying for a pass)

>html+css+js dont need to be THAT big
They shouldn't exist in such a form anyway.

This is why the UNIX way makes sense.

Not necessarily the "do one job well", but the idea of many programs acting together to perform a task.

We had all of these before JavaScript.

Basically you've got people making different parts, like A and B, and then someone to make it all work together

I know that with links in graphical mode you used to be able to post, since the captcha is simply an iframe. You select the street signs, and then copy and paste the long string they give you into the captcha box.

>REEEEEE YOU ARE AUTISTIC AND LONELY IF YOU DON'T USE THE SAME TOOLS AS MEEEEE
>LOOK TO PLAY WEB-BASED GAMES THAT MAKE ME LOOK SMART YOU NEED TO USE JS SINCE NOBODY PLAYED CHESS ONLINE BEFORE IT!!!!!

Calm down, mate. You were wrong; learn to live with it.

Cooperation in the real world usually occurs face-to-face or through chat clients, and guess what? If you know programming you can easily hook into, for example, Slack. Took me ~15 minutes for text and another 15 minutes for files.
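For the text part, hooking into Slack really is about that quick if you use an incoming webhook, which accepts a JSON body with a "text" field. A minimal sketch follows; the webhook URL is a placeholder, and file upload (which needs the real API and a token) is omitted.

```python
# Minimal sketch of posting text to Slack via an incoming webhook.
# The webhook URL below is a placeholder, not a real endpoint.
import json
import urllib.request

WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder

def build_payload(text: str) -> bytes:
    # Incoming webhooks accept a JSON body with a "text" field.
    return json.dumps({"text": text}).encode()

def post_message(text: str) -> None:
    req = urllib.request.Request(
        WEBHOOK_URL,
        data=build_payload(text),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # fails with the placeholder URL

# post_message("build finished")  # uncomment with a real webhook URL
```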

There are email-interfaces that, guess what? Aren't websites!

So, I ask you again: give me a proper example, or admit you were simply wrong.