It does indeed feel like there is a gaping hole in our software stack, as these buffer overflows are only increasing in frequency. It is time we took a serious look at the epidemic of exploits in C land and began to implement real solutions; they are out there.
It's hard to deny that easy access to the stack, especially unprivileged access, plays a serious role in creating computer crime. How many buffer overflows happen in languages with access checks on stacks? How many buffer overflows are discovered in Ada each year? None. How many in Haskell? None. How many in Java? None. The list could go on. And yet, mass exploitation in C land continues to increase. There is certainly a correlation. But there are other important causes at play as well: the language is an ill-designed clusterfuck of hacks upon hacks.
Of course, mass buffer overflows are only one indication of the security nightmare that plagues the language; the whole language is built on unsafe and insecure code. In C land, memory rules are much more lax than those of other popular languages, on par with assembly and lacking even basic safety features unless explicitly requested by the programmer.
These are only a few of the indicators of what may feed into the hopelessness and despair that cause so much distrust in C and its derivatives. The bugs cost real money and real work-hours, wasted on correcting and debugging the garbage that was compiled by compilers which don't value anything but speed and memory use.
Feel free to code in another language. No one is stopping you.
Jack Nelson
You ever wonder if these articles are shared almost exclusively by webdev types who think systems programming is monolithic and impenetrable? They fear what they don't understand, so they share another "OMG C IS BAD" article to make themselves feel better about their complete lack of understanding of anything that happens below their cushy javascript environment.
Ayden Hall
>murder should be illegal
>uhhh feel free to not kill people bro, no one's forcing you haha
Justin Sanchez
this is why c++ NEEDS TO BE STANDARD
Gavin Brooks
C++ has ALL of C's flaws, with none of the benefits added since 1982.
Kevin Foster
Second rate programmers like to justify their life choices working as webdev. No shit.
Wyatt Nelson
>that dumb children argument
Now I know why you can't code in C
git gud
William Morris
>sepples
Benjamin Price
Every language has attack vectors, dumbass.
Jayden Martin
>Believing Java is a secure programming language free of exploits
>Believing anything used for a 'large' public is secure against exploits
Jacob Foster
then why is c++ so widely used?
Jayden Reed
Bjarne is good with marketing.
Jonathan Miller
It's not. C++ is only relevant in game development these days, and only because people are afraid of C
Adam Wright
It was a great OOP language and it came early. That's all. These days Bjarne couldn't give it away with a free Ford GT.
Christopher Reyes
you are right, let's write operating systems in java... or even better, in python!
Daily reminder it has been empirically proven that Ada is easier, cheaper, and safer to use while being more powerful.
Kevin Hall
Yeah good luck getting me to switch over all my incredibly computationally intensive simulation code to Ada or Java.
Doesn't matter what happens; for the kind of work I do (physical simulations) there will always be a need for C. I would bet that this will still be true in a hundred years' time (if we're still around).
Jose Wilson
CPPcon is both hilarious and sad. Half their talks have to do with trying to convince people to use C++ for any reason.
>y-you should be writing embedded C++!
>applications code should come back to C++!
>in games programming C++ is still king!
>how to discuss C++ with C programmers
>s-stop writing C!!!!!
I don't know why they even have a con still, fucking nobody writes C++ anymore.
Elijah Thomas
haven't programmed ada in a while, but i love the type system. it's actually a pretty nice language, no idea why it's almost dead now.
Sebastian Reyes
>this actual projection
Asher Morgan
Besides the fact that you're a dumb animuposter, unironically defend this:
#include <stdio.h>
#include <string.h>

int main() {
    char str1[] = "Hello from the C language";
    char str2[40];

    for (int i = 0; i < 40; ++i)
        str2[i] = 'h';

    strncpy(str2, str1, 5);

    for (int i = 0; i < 40; ++i)
        printf("%c", str2[i]);

    puts("");
}
Output: Hellohhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh
Notice anything? How can you possibly justify this? How shit can your "language" be?
Michael Stewart
>Of course, mass buffer overflows are only one indication of the security nightmare that plagues the language
Actually, all high-profile vulnerabilities are related to buffer overflows. There's not much else to blame.
Isaiah Hall
Just don't suck at coding in C, it's that simple.
Jaxson Perez
It should have been banned two fucking decades ago, programmers are morons with an overblown illusion of control and competence.
Ayden Bennett
Bugs happen. You obviously know zilch about software engineering.
Luke Cruz
Bugs aren't exclusive to C. As long as you aren't a retard and know how to debug then you're all set.
Lincoln James
>writing off the end of the array >WHY IS IT DOING WEIRD THINGS?!??!?!
That's called undefined behavior, dumbass.
Gavin Phillips
Type inference is good at fucking shit up too. Race conditions too.
Hence Rust: the execution might be lacking, but their identification of what is necessary to make a language safe is spot on.
Landon Baker
Autism finds it hard to use real strong typing
Elijah Young
But in C they're catastrophic. That's indefensible. Other languages fail properly without compromising your entire system.
You didn't understand. It failed to nul-terminate a copied string. strncpy promises to nul-terminate your string, but sometimes it doesn't. That's how shitty C is.
Ethan Long
B-but isn't C++ king of languages?
Camden Brown
Bad programmers will ensure you will experience economic damage for shit they fucked up.
Joshua Stewart
>strncpy promises to nul-terminate your string
Are you retarded?
Josiah Barnes
I see nothing wrong with this code
Am I missing something or is user retarded?
Jordan Brooks
As I said, if you have properly debugged your C code then you're all set. If someone else fucks up using your code then that's their fault.
Also I care little for the exception-handling capabilities of other languages when I absolutely need as much efficiency as possible for simulation code (which I do).
Christian James
Maybe 20 years ago. Everyone jumped to other more domain specific languages instead of continuing to use C++, the jack of all trades, master of none language.
Jordan Fisher
So even the manual writers realized that C is inconsistent, and yet you didn't know that?
Fuck off.
You're missing the fact that there's no nul-termination in the copied string. C failed. Your buffer overran. Now an attacker can read your credit card number.
Cooper Evans
Sometimes you just can't fix things by going solo. It's well beyond time to get rid of C programmers.
Adam Foster
>Beware of buffer overruns!
What is that if not an admission that the language sucks?
Joshua Robinson
Dude, it's Sup Forums. Just let it go.
Ryan Cruz
>manual writers
It's defined that way in the standard. C standard library functions trust the programmer to not do stupid things. If you need null checks and array bounds, you do them yourself, the functions shouldn't be wasting needless clock cycles assuming you're retarded, because it leads to redundant code.
Jason Bailey
>too stupid to write proper C code
>too stupid to use asan/lsan/etc to properly test your shit before shipping it
>somehow this is the language's fault
lmao, fucking webdev brainlets
Jose Myers
Well, you're not wrong at all, but "C creates insecure software and must be banned" is different from "Rust exists".
Luis Perry
>You're missing the fact that there's no nul-termination in the copied string
>I copied characters from one array to another, and it worked WAAAAHHHHH
You overwrote the null terminator dumbass. You're responsible for allocating enough space for it, and if that's somehow not possible, you're responsible for putting it back before attempting to run another string.h library function on it again.
Joseph Hughes
>it's okay if the behaviour is retarded as long as that was the intended behaviour
You're not making your case any stronger, C fanboys. If anything, you're just embarrassing yourselves further.
Benjamin White
>incompetent programmers create a non-perfect program
>compiler compiles the code expecting the program to be good
>people blame the compiler/the language
>shitty programmers get off scot-free
Bad programmers in mission critical shit should be BANNED.
Jack Garcia
Meant to quote
Jordan Bailey
Okay, here's another example of C's idiocy:
void wtf(char *str) {
    char newstr[80];
    strlcpy(newstr, str, 80);
}

int main() {
    char c, buf[4096];
    int i = 0;

    while ((buf[i++] = getchar()) != '\n');

    i = 1;
    wtf(buf);
    i = 2;

    printf("i = %d\n", i);
    return 0;
}
Input 160 characters. Resulting output: i = 1
The statement i = 2 gets completely skipped, and you get the wrong output.
WTF kinda language is this that randomly skips AN ENTIRE LINE OF CODE THAT YOU GAVE IT?!?!?!?
Austin Gray
When you're copying part of a string into another string, you usually don't want the bigger string to be interrupted like this.
After all, if all you want is a string of first N characters, you can do that with the original string... and not use strncpy() at all.
strncpy() is designed correctly for its use case.
Kayden Collins
Why don't programmers have any kind of real certification process?
William Gonzalez
For most uses there's not a huge difference in timing between C and C++, but for the simulation work I tend to do I need to squeeze every available bit of performance out as possible (time-domain simulations are a bitch) so using C is really the only way to go.
Plus it's slightly easier to call from my Python graphing interfaces (using ctypes) due to C++ name-mangling. Obviously you only need to add some extern "C" declarations in C++, but why bother when you can use C, which is faster for these simulations anyway.
Bentley Howard
this 100%
Benjamin Anderson
>all other languages don't have this problem
>but it's the coders' fault, not my language! my language is purrrfect!
Yeah nah.
Nathan White
strlcpy isn't a standard library function. Fuck off.
I also don't understand the sentiment in this thread. If C was wildly unpredictable and impossible to use as a software engineering tool, nobody would use it.
Charles Sullivan
compiler/flags?
Christian Flores
You're right, OP. Instead, let's work with languages whose INTERPRETERS are written in C.
But the fact that you need non-standard kludges like strlcat and strlcpy to make C even usable tells you a lot about the language.
Chase James
Nim's interpreter/compiler is written in Nim.
Oliver Jones
Let's say I was doing 3D simulations. Is OOP a good idea? If it is, should I use C++ or some C implementation of OOP?
Leo Hall
Learn the language you're using from inside out. Don't rely on copy pasting stack overflow and cry a river when shit goes bad.
The language was created and is good. If you know how to use it correctly, nothing bad will happen.
So yes, it's the coder's fault.
Jason James
I know the language. I exploit it professionally. Thanks for continuing to use C, all your buffer overflows make my living.
Levi Williams
>implying you're getting buffer overflows from me
John Phillips
>But the fact that you need non-standard kludges
No you don't. It says nothing about the language. People are simply too stupid to implement these functions themselves because modern programming ideology scorns implementing your own libraries for any reason.
Eli Hughes
Nim compiles to C.
Chase Edwards
That seems like an issue with strlcpy(). Do that with strncpy() with proper buffer management and then complain. But you don't; strlcpy() is just a shortcut to doing it right with strncpy(). As long as retards stay away from C it's fine. This is true of any language.
Brody Sanchez
I couldn't tell you without knowing what it is you intend to do. If it's all in the steady state (i.e. not time-domain) then performance isn't really a big issue and you can get away with using something like Python (with numpy of course).
Also if it's a problem which can be heavily parallelised then you'll want to go down the CUDA / OpenCL/GL route (or any other good GPU acceleration techniques).
Daniel Parker
>If C was wildly unpredictable and impossible to use as a software engineering tool, nobody would use it.
Provide an actual argument for that claim.
Mason Allen
Real C programmers don't write code with buffer overflows. Have fun depending on the mistakes of amateur fuckwits.
Nathaniel Perez
Are you the infamous hacker Sup Forums?
Owen Reyes
>If C was wildly unpredictable and impossible to use as a software engineering tool, nobody would use it.
The fact of the matter is C is plummeting quickly as modern alternatives emerge.
It's been used for so long due to legacy code and maturity, but it's getting to a point where it's simply unsustainable any longer.
Isaac Hernandez
???
Aiden Wright
You're a useful idiot, promoting a language whose continued use hurts almost all of us, except the security """researchers""".
Are you one of the "ethical" ones who only sell to security agencies the UN likes and to large multinationals? Or are you honest with yourself and simply sell, period?
Chase Campbell
>That seems like an issue with strlcpy().
No, the error doesn't happen on that code. I mispasted. See
Carson Roberts
>I know the language
Sure thing.
Connor Foster
As long as people are writing low-level programs without sacrificing portability, C will never go away.
Thomas Collins
Nobody is a real C programmer then.
Zachary Anderson
I see, your post makes sense now.
Mason Morales
See
>No you don't.
Of course you do. Those functions are everywhere from the Linux kernel to the iPhone.
Tyler Watson
It was probably originally made in C though and bootstrapped to Nim later on. Then the newer versions of Nim are made with the older versions.
Henry Adams
>linux kernel
Linux isn't BSD.
Brandon Wood
Currently.
Christian Perez
Just because your 1st year degree-course C code breaks horrendously with undefined behaviour, doesn't mean everyone else's does.
strncpy() writes "num" bytes. Not all of them are necessarily copies; it stops copying after it encounters a null terminator in the source, and writes null terminators for the rest of the specified bytes.
strcpy() copies bytes until it finds and copies a null terminator from the source. It doesn't take a "num" argument.
William Wright
strlcpy() is just a convenience function; you can do proper buffer management with the standard library using strncpy() by giving strncpy one less than the buffer size, then NUL-terminating it yourself.
Samuel Watson
Yeah. Except everyone else's does too. Buffer overflows in C code are widespread even in high-quality code written by seasoned skilled programmers.