Programming in C must be BANNED

arstechnica.com/security/2016/08/code-dumped-online-came-from-omnipotent-nsa-tied-hacking-group
xorcatt.wordpress.com/2016/08/16/equationgroup-tool-leak-extrabacon-demo
Once again, a buffer overflow has led to new vulnerabilities in C-land. This time all pre-2013 Cisco routers are affected.

It does indeed feel like there is a gaping hole in our software stack, as these buffer overflows only keep increasing in frequency. It is time we took a serious look at the epidemic of exploits in C-land and began to implement real solutions; they are out there.

It's hard to deny that easy access to the stack, especially unprivileged access, plays a serious role in creating computer crime. How many buffer overflows happen in languages with access checks on stacks? How many buffer overflows are discovered in Ada each year? None. How many in Haskell? None. How many in Java?
None. The list could go on. And yet, mass exploitation in C-land continues to increase. There is certainly a correlation. But there are other important causes at play as well: the language is an ill-designed clusterfuck of hacks upon hacks.

Of course, mass buffer overflows are only one indication of the security nightmare that plagues the language: the whole language is built on unsafe and insecure code. In C-land, memory rules are much more lax than those of other popular languages, on par with assembly, lacking even basic safety features unless they are explicitly requested by the programmer.

Nearly 70% of pre-2013 routers are Cisco and are vulnerable to being hacked in %CurrentYear%.
gigaom.com/2013/02/27/chart-cisco-owns-the-switching-and-routing-world

These are only a few of the indicators of what may feed into the hopelessness and despair that cause so much distrust in C and its derivatives. The bugs cost real money and real work-hours, wasted on correcting and debugging the garbage emitted by compilers which don't value anything but speed and memory use.

Feel free to code in another language.
No one is stopping you.

You ever wonder if these articles are shared almost exclusively by webdev cunts who think systems programming is monolithic and impenetrable to get into?
They fear what they don't understand, so they share another "OMG C IS BAD" article to make themselves feel better about their complete lack of understanding of anything that happens below their cushy javascript environment.

>murder should be illegal
>uhhh feel free to not kill people bro no ones forcing you haha

this is why c++ NEEDS TO BE STANDARD

C++ has ALL of C's flaws, with none of the benefits added since 1982.

Second rate programmers like to justify their life choices working as webdev. No shit.

>that dumb children argument

Now I know why you can't code in C

git gud,
faggot

>sepples

Every language has attack vectors dumb ass

>Believing Java is a secure programming language free of exploits
>Believing anything used for a 'large' public is secure against exploits

then why is c++ so widely used?

Bjarne is good with marketing.

It's not.
C++ is only relevant in game development these days, and only because people are afraid of C

It was a great OOP language and came early. That's all. Bjarne can't give away a free Ford GT.

you are right, let's write operating systems in java.. or even better, in python !

youtube.com/watch?v=YnWhqhNdYyk

Autismo will hate this.

Daily reminder it has been empirically proven Ada is easier, cheaper, and safer to use while being more powerful.

Yeah good luck getting me to switch over all my incredibly computationally intensive simulation code to Ada or Java.

Doesn't matter what happens, for the kind of work I do (physical simulations) there will always be the need for C. I would bet that this will still be true in a hundred years time (if we're still around).

CPPcon is both hilarious and sad.
Half their talks have to do with trying to convince people to use C++ for any reason.
>y-you should be writing embedded C++!
>applications code should come back to C++!
>games programming in C++ is still king!
>how to discuss C++ with C programmers
>s-stop writing C!!!!!

I don't know why they even have a con still; fucking nobody writes C++ anymore.

haven't programmed ada in a while, but i love the type system.
it's actually a pretty nice language, no idea why it's almost dead now.

>this actual projection

Besides the fact that you're a dumb animuposter, unironically defend this:

#include <stdio.h>
#include <string.h>
int
main()
{
char str1[]= "Niggers tongue my anus";
char str2[40];

for (int i = 0; i < 40; ++i) str2[i] = 'h';

strncpy (str2, str1, 5);
for (int i = 0; i < 40; ++i)
printf("%c" , str2[i]);

puts("");
}

Output:
Niggehhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh


Notice anything? How can you possibly justify this? How shit can your "language" be?

>Of course, mass buffer overflows are only one indication of the security nightmare that plagues the language

Actually, all high-profile vulnerabilities are related to buffer overflows. There's not much else to blame.

Just don't suck at coding in C, it's that simple.

It should have been banned two fucking decades ago, programmers are morons with an overblown illusion of control and competence.

Bugs happen. You obviously know zilch about software engineering.

Bugs aren't exclusive to C. As long as you aren't a retard and know how to debug then you're all set.

>writing off the end of the array
>WHY IS IT DOING WEIRD THINGS?!??!?!

That's called undefined behavior, dumbass.

Type inference is good at fucking shit up too. Race conditions too.

Hence Rust, the execution might be lacking, but their identification of what is necessary to make a language safe is spot on.

Autism finds it hard to use real strong typing

But in C they're catastrophic. That's indefensible. Other languages fail properly without compromising your entire system.

You didn't understand. It failed to nul-terminate a copied string. strncpy promises to nul-terminate your string, but sometimes it doesn't. That's how shitty C is.

B-but isn't C++ king of languages?

Bad programmers will ensure you will experience economic damage for shit they fucked up.

>strncpy promises to nul-terminate your string

Are you retarded?

I see nothing wrong with this code

Am I missing something or is user retarded?

As I said, if you have properly debugged your C code then you're all set. If someone else fucks up using your code then that's their fault.

Also I care little for the exception-handling capabilities of other languages when I absolutely need as much efficiency as possible for simulation code (which I do).

Maybe 20 years ago.
Everyone jumped to other more domain specific languages instead of continuing to use C++, the jack of all trades, master of none language.

So even the manual writers realized that C is inconsistent, and yet you didn't know that yet?

Fuck off.

You're missing the fact that there's no nul-termination in the copied string. C failed. Your buffer overran. Now an attacker can read your credit card number.

Sometimes you just can't fix things by going solo. It's well beyond time to gas C programmers.

>Beware of buffer overruns!
What is that if not an admission that the language sucks?

Dude, it's Sup Forums. Just let it go.

>manual writers
It's defined that way in the standard.
C standard library functions trust the programmer to not do stupid things.
If you need null checks and array bounds, you do them yourself, the functions shouldn't be wasting needless clock cycles assuming you're retarded, because it leads to redundant code.
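
For what it's worth, a minimal sketch of what "do the checks yourself" can look like in practice; the helper name and sizes here are invented for illustration:

#include <stdio.h>
#include <string.h>

/* Hypothetical helper: copy src into dst only if it provably fits,
 * terminator included. Returns 0 on success, -1 if it would overflow. */
static int checked_copy(char *dst, size_t dst_size, const char *src)
{
    size_t len = strlen(src);
    if (len + 1 > dst_size)
        return -1;              /* caller decides how to fail */
    memcpy(dst, src, len + 1);  /* copies the '\0' as well */
    return 0;
}

int main(void)
{
    char buf[16];

    if (checked_copy(buf, sizeof buf, "fits fine") == 0)
        puts(buf);
    if (checked_copy(buf, sizeof buf, "this string is far too long for buf") != 0)
        puts("rejected: would have overflowed");
    return 0;
}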

>too stupid to write proper C code
>too stupid to use asan/lsan/etc to properly test your shit before shipping it
>somehow this is the language's fault
lmao, fucking webdev brainlets

Well, you're not wrong at all, but "C creates insecure software and must be banned" is different from "Rust exists".

>You're missing the fact that there's no nul-termination in the copied string

>I copied characters from one array to another, and it worked WAAAAHHHHH

So... retarded then?


>cplusplus.com/reference/cstring/strncpy/
>Copies the first num characters of source to destination.
Reading comprehension.

>t. Java "programmers" who don't understand the very concept of C programming

>you shouldn't waste clock cycles with security
C-fags, everyone!

Checking array bounds and providing a proper string implementation isn't that expensive. Pascal is just as low-level as C and does it just fine.

pascal strings are limited to 255 characters, that's not very useful

Don't blame standard methods for working exactly as intended.

>thinks it's acceptable to do """security-checks""" when they're not explicitly necessary in performance-intensive C code

Never work on simulations.

Can someone explain the difference between memcpy and strncpy?
They seem identical in behavior, I always use memcpy.

>tells strncpy to copy the first 5 characters
>expects it to copy something that isn't there
Are you fucking retarded?

because microsoft pushed it

Why would one choose C over C++ for simulations? Or do people just kind of end up mixing the two? Trying to get into simulations myself.

docwiki.embarcadero.com/RADStudio/Berlin/en/String_Types_(Delphi)

Congratulations at failing. You earn nothing.

You overwrote the null terminator dumbass.
You're responsible for allocating enough space for it, and if that's somehow not possible, you're responsible for putting it back before attempting to run another string.h library function on it again.
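
For reference, a minimal sketch of the earlier snippet with the terminator put back by hand, as described above (placeholder source string, destination size kept from the original):

#include <stdio.h>
#include <string.h>

int main(void)
{
    char str1[] = "some source string";
    char str2[40];

    strncpy(str2, str1, 5);   /* copies 5 chars; no terminator, since strlen(str1) >= 5 */
    str2[5] = '\0';           /* the programmer's job: put the terminator back */

    printf("%s\n", str2);     /* prints "some " and stops at the '\0' */
    return 0;
}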

>it's okay if the behaviour is retarded as long as that was the intended behaviour
You're not making your case any stronger, C-fags. If anything, you're just embarrassing yourselves further.

>incompetent programmers create a non-perfect program
>compiler compiles the code expecting the program to be good
>people blame the compiler/the language
>shitty programmers get off scot-free

Bad programmers in mission critical shit should be BANNED.

Meant to quote

Okay, here's another example of C's idiocy:

#include <stdio.h>
#include <string.h>  /* strlcpy lives here on the BSDs; it is non-standard elsewhere */

void
wtf(char *str) {
char newstr[80];

strlcpy(newstr, str, 80);
}

int
main() {
char c, buf[4096];
int i=0;

while ((buf[i++] = getchar()) != '\n');

i=1;
wtf(buf);
i=2;

printf("i = %d\n", i);

return 0;
}

Input 160 characters. Resulting output:
i = 1

Instruction at line 17 (namely, i=2) gets completely skipped, and you get the wrong output.

WTF kinda language is this that randomly skips AN ENTIRE LINE OF CODE THAT YOU GAVE IT?!?!?!?

When you're copying part of a string into another string, you usually don't want the bigger string to be interrupted like this.

After all, if all you want is a string of first N characters, you can do that with the original string... and not use strncpy() at all.

strncpy() is designed correctly for its use case.
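
A minimal sketch of the fixed-width-field use case strncpy() was arguably designed around; the struct here is invented for illustration:

#include <stdio.h>
#include <string.h>

/* Invented fixed-width record field, the kind of thing strncpy() suits:
 * the field is NOT required to be NUL-terminated when full. */
struct record {
    char name[8];
    int  value;
};

int main(void)
{
    struct record r;

    /* Fills all 8 bytes: copies the name and zero-pads the rest, or
     * fills the whole field with name bytes (no terminator) if it's long. */
    strncpy(r.name, "abc", sizeof r.name);
    r.value = 42;

    /* Readers use a length-capped print and never assume a terminator. */
    printf("%.8s = %d\n", r.name, r.value);
    return 0;
}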

Why don't programmers have any kind of real certification process?

For most uses there's not a huge difference in timing between C and C++, but for the simulation work I tend to do I need to squeeze every available bit of performance out as possible (time-domain simulations are a bitch) so using C is really the only way to go.

Plus it's slightly easier to call from my Python graphing interfaces (using ctypes) due to C++ name-mangling - obviously you only need to add some extern "C" declarations in C++, but why bother when you can use C, which is faster for these simulations anyway.
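
In case it helps, a minimal sketch of that setup; the file, function, and build command are invented for illustration, and the extern "C" guard is what C++ would need to avoid name mangling:

/* sim.c -- hypothetical kernel exposed to a Python ctypes front-end.
 * Build as a shared library, e.g.:  cc -O2 -fPIC -shared sim.c -o libsim.so
 * The Python side would then do roughly:
 *   lib = ctypes.CDLL("./libsim.so")
 *   lib.step.restype = ctypes.c_double; lib.step.argtypes = [ctypes.c_double]*2
 */

#ifdef __cplusplus
extern "C" {                      /* prevents C++ name mangling */
#endif

double step(double dt, double state)
{
    return state + dt * state;    /* stand-in for the real physics */
}

#ifdef __cplusplus
}
#endif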

this 100%

>all other languages don't have this problem
>but it's the coders' fault, not my language! my language is purrrfect!
Yeah nah.

strlcpy isn't a standard library function
fuck off


I also don't understand the sentiment in this thread.
If C was wildly unpredictable and impossible to use as a software engineering tool, nobody would use it.

compiler/flags?

You're right, OP. Instead, let's work with languages whose INTERPRETERS are written in C.

Sorry, I accidentally pasted the fixed code that I use to teach my students how to work around such language flaws. The original example was taken straight from the FreeBSD documentation and doesn't use anything non-standard. See: docs.freebsd.org/doc/4.3-RELEASE/usr/share/doc/en/books/developers-handbook/x1136.html

But the fact that you need non-standard kludges like strlcat and strlcpy to make C even usable tells you a lot about the language.
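
For comparison, a minimal sketch of what the unfixed version presumably looks like with plain strcpy(); this is reconstructed from the snippet above, not quoted verbatim from the handbook:

#include <stdio.h>
#include <string.h>

void
wtf(char *str)
{
    char newstr[80];

    strcpy(newstr, str);   /* no length limit: a 160-byte input smashes the stack */
}

int
main(void)
{
    char buf[4096];
    int i = 0;

    while ((buf[i++] = getchar()) != '\n')
        ;

    i = 1;
    wtf(buf);   /* UB: the clobbered stack frame can make execution resume past
                   the i = 2 below, matching the "i = 1" output quoted earlier */
    i = 2;

    printf("i = %d\n", i);
    return 0;
}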

Nim's interpreter/compiler is written in Nim.

Let's say I was doing 3D simulations. Is OOP a good idea? If it is, should I use C++ or some C implementation of OOP?

Learn the language you're using from inside out.
Don't rely on copy pasting stack overflow and cry a river when shit goes bad.

The language was created and is good. If you know how to use it correctly, nothing bad will happen.

So yes, it's the coder's fault.

I know the language. I exploit it professionally. Thanks for continuing to use C, all your buffer overflows make my living, you faggot.

>implying you're getting buffer overflows from me

>But the fact that you need non-standard kludges

No you don't.
It says nothing about the language.
People are simply too stupid to implement these functions themselves because modern programming ideology scorns implementing your own libraries for any reason.
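
A minimal sketch of rolling such a function yourself, modeled on the BSD strlcpy() interface; treat it as an illustration, not a drop-in replacement:

#include <stdio.h>
#include <string.h>

/* Hand-rolled strlcpy-alike: copies at most size-1 bytes and always
 * NUL-terminates when size > 0. Returns strlen(src) so callers can
 * detect truncation, mirroring the BSD interface. */
static size_t my_strlcpy(char *dst, const char *src, size_t size)
{
    size_t srclen = strlen(src);

    if (size > 0) {
        size_t n = (srclen < size - 1) ? srclen : size - 1;
        memcpy(dst, src, n);
        dst[n] = '\0';
    }
    return srclen;
}

int main(void)
{
    char small[8];

    if (my_strlcpy(small, "definitely too long", sizeof small) >= sizeof small)
        puts("truncated, but safely terminated:");
    puts(small);
    return 0;
}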

Nim compiles to C.

That seems like an issue with strlcpy().
Do that with strncpy() with proper buffer management and then complain.
But you don't; all strlcpy() is, is a shortcut to doing it right with strncpy(). As long as retards stay away from C it's fine. This is true of any language.

I couldn't tell you without knowing what it is you intend to do. If it's all in the steady state (i.e. not time-domain) then performance isn't really a big issue and you can get away with using something like Python (with numpy of course).

Also if it's a problem which can be heavily parallelised then you'll want to go down the CUDA / OpenCL/GL route (or any other good GPU acceleration techniques).

>If C was wildly unpredictable and impossible to use as a software engineering tool, nobody would use it.

Provide an actual argument for that claim.

Real C programmers don't write code with buffer overflows. Have fun depending on the mistakes of amateur fuckwits.

Are you the infamous hacker Sup Forums?

>If C was wildly unpredictable and impossible to use as a software engineering tool, nobody would use it.
The fact of the matter is that C is plummeting quickly as modern alternatives emerge.

It's been used for so long due to legacy code and maturity, but it's getting to a point where it's simply unsustainable.

???

You're a useful idiot, promoting a language whose continued use hurts almost all of us, except the security """researchers""".

Are you one of the "ethical" ones who only sell to security agencies the UN likes and to large multinationals? Or are you honest with yourself and simply sell, period?

>That seems like an issue with strlcpy().
No, the error doesn't happen on that code. I mispasted. See

>I know the language
Sure thing, Pajeet.

As long as people are writing low-level programs without sacrificing portability, C will never go away.

Nobody is a real C programmer then.

I see, your post makes sense now.

See >No you don't.
Of course you do. Those functions are everywhere from the Linux kernel to the iPhone.

It was probably originally made in C though and bootstrapped to Nim later on. Then the newer versions of Nim are made with the older versions.

>linux kernel
linux isn't bsd faggot

Currently.

Just because your 1st year degree-course C code breaks horrendously with undefined behaviour, doesn't mean everyone else's does.

github.com/torvalds/linux/search?utf8=✓&q=strlcpy

memcpy() copies "num" bytes, always.

strncpy() writes "num" bytes. Not all of them are necessarily copies; it stops copying after it encounters a null terminator in the source, and writes null terminators for the rest of the specified bytes.

strcpy() copies bytes until it finds and copies a null terminator from the source. It doesn't take a "num" argument.

strlcpy() is just a convenience function; you can do proper buffer management with the standard library using strncpy() by giving strncpy one less than the buffer size, then NUL-terminating it yourself.
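
A minimal sketch putting those three behaviors side by side, plus the strncpy-then-terminate pattern just described:

#include <stdio.h>
#include <string.h>

int main(void)
{
    const char short_src[8] = "hi";   /* array init zero-fills bytes 2..7 */
    const char *long_src = "much too long for an 8-byte buffer";
    char a[8], b[8], c[8], d[8];

    memset(a, 'x', sizeof a);
    memset(b, 'x', sizeof b);

    /* memcpy: always moves exactly num bytes; terminators are irrelevant. */
    memcpy(a, short_src, 5);          /* copies 'h','i','\0','\0','\0' */

    /* strncpy with a too-long source: fills all 5 bytes, writes NO terminator. */
    strncpy(b, long_src, 5);          /* b is not a C string afterwards */

    /* strcpy: copies up to and including the terminator; only safe here
     * because short_src is known to fit. */
    strcpy(c, short_src);

    /* The pattern described above: buffer size minus one, then terminate yourself. */
    strncpy(d, long_src, sizeof d - 1);
    d[sizeof d - 1] = '\0';

    printf("a=%s c=%s d=%s b(first 5)=%.5s\n", a, c, d, b);
    return 0;
}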

Yeah. Except everyone else's does too. Buffer overflows in C code are widespread even in high-quality code written by seasoned skilled programmers.