What's the worst deficiency of C that you would like to fix?

For me: lack of namespaces.

undefined behavior

So you literally just want to ruin C?

Just use C++ you dumb fuck.

Multi threading.

en.wikipedia.org/wiki/C11_(C_standard_revision)
musl-libc.org/

in the standard library since c11

It's optional, and therefore completely useless.

GCC and Clang will literally never support it

musl-libc.org/how.html

Understand now, Satan?

>not using pthreads

Not portable

>lack of namespaces
how would you fix that in a backwards compatible manner?
>undefined behavior
but that's the best part!

sourceware.org/pthreads-win32/

Signed integer overflow is undefined behavior. Sometimes compilers """optimize""" it and produce surprising results (i.e., break existing code).
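For the record, the usual defensive pattern is to test for overflow before the addition happens; a minimal sketch (safe_add is an invented name):

```c
#include <assert.h>
#include <limits.h>
#include <stdbool.h>

/* Check for overflow *before* adding: the overflowing addition itself
 * is the undefined behavior, so a check written after the fact can be
 * legally optimized away by the compiler. */
bool safe_add(int a, int b, int *out) {
    if ((b > 0 && a > INT_MAX - b) ||
        (b < 0 && a < INT_MIN - b))   /* INT_MIN - b is safe when b < 0 */
        return false;                  /* would overflow */
    *out = a + b;
    return true;
}
```

The point is that `if (a + b < a)`-style checks written *after* the fact are exactly what optimizers delete under the no-overflow assumption.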

The license terms make it unusable for real projects.

>break existing code
what? the code was already broken! integer overflow being undefined behavior isn't something new, it's as old as C; the code was always broken, just that nobody noticed until recently; you'd rather hide the bug?

not at all

I'd rather they change the standard to define it to wrap according to two's complement like all modern microprocessors use.

why not make a new language altogether? why change C?

array decay, a.k.a., arrays aren't full first class types that can be passed to/returned from functions without their element count getting disassociated.
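A quick illustration of the decay (decayed_len and local_len are invented names):

```c
#include <assert.h>
#include <stddef.h>

/* The "[16]" below is decoration only: the parameter is really an
 * int *, so sizeof sees a pointer, not the array. */
size_t decayed_len(int arr[16]) {
    return sizeof(arr) / sizeof(arr[0]);   /* sizeof(int *) / sizeof(int) */
}

size_t local_len(void) {
    int a[16];
    return sizeof(a) / sizeof(a[0]);       /* 16, but only in this scope */
}
```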

Why not just make "-fwrapv" be the default and have another option for getting undefined behavior?

how would you fix it in a backwards compatible manner?

Just pass in a struct, dumbshit

the C standard doesn't have switches

Honestly all that's really needed is to make it more convenient to pass around "fat pointers" (i.e., a struct with the pointer and bounds information).

Modula-2 did this; it makes a huge difference in robustness.
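A minimal hand-rolled fat pointer is just a struct that carries the bounds along with the data (span and span_get are invented names, not any standard API):

```c
#include <assert.h>
#include <stddef.h>

struct span { int *data; size_t len; };

/* With the length travelling alongside the pointer, a bounds check
 * is always possible at the point of use. */
int span_get(struct span s, size_t i, int fallback) {
    return i < s.len ? s.data[i] : fallback;
}
```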

nobody's saying that workarounds are impossible, but if you had half a clue about the history of C and its security/correctness struggles, you'd know that being correct is cumbersome enough that mistakes do frequently get made.

There's no system.println();
Stupidest shit, you can't even print to the console.

Wow you're dumb.

It's not a workaround when that is what you are supposed to do. If someone passes arrays pointer only then it is their own dumbshit fault for then screwing up and asking for elements outside the bounds of the array. The whole point of C is that it allows you do to such things. If this is a frequent or serious issue for you then you shouldn't be using C.

If you knew anything about the history of C you would know it intentionally did not add these features. Yes, people have used C and fucked shit up, but that is their own fault for 1) fucking up 2) using a language that allowed them to fuck up.

Fuck off you stupid fucking nigger
Holy shit, burn in a fucking bottomless pit, you dumb faggot
Son a bitch, go off yourself degenerate scum
You think you're funny motherfucker? You're not. You know why? You're a sub-human nigger. Go drink bleach

>Wow you're dumb.

Typical C programmer

>It's not the language's fault, ur dum!

It would not take much to make C a usable language. Making arrays a first class type and having true strings would make the language much more pleasant and much more robust.
Of course it will break backward compatibility and so similar improvements will never be implemented.

You can have a custom lib implementing it

>You can have a custom lib implementing it

Not sufficient, it must be in the language spec.
There already are such libraries out there.
Are they popular?
No, because they deviate from standard C practice.
Also, how much software written in other languages depends on the standard C library?
A lot.
Why do you think exploits happen?

Good point.
Which one would you use/recommend?

Because people invent bicycles?

>Which one would you use/recommend?

I don't know Rust well enough to make a judgement, but making a slightly better C is not a technical issue, it's a social/political one.
I don't know what the systems' language of the future will be called, I just know that C is horribly broken.

Lack of anonymous closure types

what kind of steel is that edge, cockroach?

I mean C libraries. There was a nice project aimed at mimicking essential c++ features, but I forgot its name

>design decisions that allow the compiler to segfault or do whatever before the point in the corresponding source code where the undefined behaviour would have occurred, accomplishing nothing except making debugging harder
>the best part
Ctards confirmed for masochists.

This is a good point. Just throw C in the garbage where it always belonged and use an actually decent language.

Look babby, it's not supposed to hold your hand. There are many reasons why it doesn't have these features. If you were right and I was wrong then C would have come and gone like the many other programming languages that have in its lifetime. If you can't handle the heat, get out of the frying pan. Stop using C where that kind of security is essential, or get over yourself and use it correctly.

/thread

>C is designed to be a portable language
>well in this instance it's implementation defined

>not taking a joke

>the whole point of C is that it allows you to do such things
>it intentionally did not add in these features
So it was intentionally made to be an elitist pile of shit which encourages and makes it easy for programmers to make errors, then blames them for doing so? Assuming the errors are even detected and don't silently do something weird, anyway.

Okay then.

>There was a nice project aimed at mimicking essential c++ features, but I forgot its name
its called c++

Ada, motherfucker. We've literally had a well designed safe systems language since the 1980s, and everyone ignores it or bashes strawmen of it.

scanf bug

Lack of a string datatype.
char arrays are where so many people fuck up

Probably the small stuff.
Or the undefined behavior stuff which is simply there to make hardware vendors able to do whatever they want in particular circumstances and put extra load on the programmer.

It's really a rather silly discussion to take. Other languages shouldn't be bad enough that a language like C competes if you care about writing quality software.

>Look babby, it's not supposed to hold your hand.

The C programmer's machismo

>There are many reasons why it doesn't have these feature.

Actually only one: it was badly designed.

>If you were right and I was wrong then C would have come and gone like the many other programming languages that have in its lifetime. If you can't handle the heat, get out of the frying pan. Stop using C where that kind of security is essential, or get over yourself and use it correctly.

You have never shipped a C application and you are just a canadian shitposter, am I right?

no, it was made to give you the most freedom and control possible. Of course you can't fuck up memory allocation if the language does it for you, but then try making a kernel with it.

No, it intentionally doesn't carry the baggage that other languages do, which makes it ideal for performance-crucial applications such as kernel development. This of course means you need to take extreme care not to fuck up. It clearly doesn't suit your sloppy agile development process so don't use it.

Nah, it was a C library
ooc-coding.sourceforge.net/

Not everyone. Its still used in the military, where safety matters more than anything

>the only choices are the way C does it or full on garbage collection
What a lovely false dilemma you've got there.

It's not Java

C# already exists but freetards refuse to make decent compilers for it.

Besides, there's been an Ada implementation for Arduino; quite amusing

C99 is pretty much perfect. I am sure there are some very very very minuscule nuisances, but most of them come from the implementation rather than the spec.

>Ada, motherfucker

I was going to look into it, but at the moment I'm busy.
I know it has a bad reputation among programmers because it's too constraining, verbose and it's affiliated with the US military.
Actually the constraining thing is probably a plus. If C is the freedom, give me constraints.

I fucking love Ada.

It doesn't carry that baggage because of poor design choices made while hacking a language and an OS together decades ago

>char arrays are where so many people fuck up
Elaborate.
I agree that I'd much rather use a type that's just a length value and a pointer to the characters but a string datatype doesn't strike me as "necessary".

It's really that the standard library relies on null termination that's the issue. It doesn't even make much sense. A char is 8 bits. That can represent 0-255. So you're using 8 bits at the end of your string to mark its end. That's nice, it allows you to have any length and it extends far beyond the 255 which would be your limit if you just used that piece of memory as a length instead. But are you really that constrained if your string is 255 characters long? Why are you using a string in the first place then? Are you not using it for convenience?
Even a 2 byte number is large enough to store any string I've ever used in a program. It'd allow me to determine the length of the string with a single look at the value and it allows you to apply better search methods than a linear lookup.
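A sketch of the length+pointer alternative, in the spirit of libraries like sds or mpv's bstr (the names here are made up, not taken from either):

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

struct lstr { const char *p; size_t len; };

/* One O(n) scan at the boundary with C-string code; after that,
 * the length is a field read, not a strlen() walk. */
struct lstr lstr_from_cstr(const char *s) {
    struct lstr r = { s, strlen(s) };
    return r;
}
```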

Yeah. I guess a string type would be nice. But it's really just the idea of null termination for character arrays that's my problem. So many inconveniences for such a minor gain.

>Implying you can't add features without sacrificing speed

Shut the fuck up, idiot. You don't know anything about programming if you actually believe all the rubbish you just wrote.
>all strings consist of 8 bit characters
lmaoing at your life

what a lovely strawman you got there
the point was that the cost of more control is being more prone to errors. If you don't need the control, don't use c. If you do, c is as error prone as it needs to be.

>Yeah. I guess a string type would be nice. But it's really just the idea of null termination for character arrays that's my problem. So many inconveniences for such a minor gain.

I think it was just laziness on the part of Ritchie and co.
It makes splitting strings easier and faster, and you can save a couple of bytes and asm instructions here and there, and that's about it.

Contrasting disparaging descriptions of it with the experience of actually using it is hilarious. The verbosity is the only real complaint that holds up, and the long list of other desirable features more than makes up for it. It's only "too constrained" in the sense that it tries very hard to tell you about stupid shit that will make your program crash at compile time rather than letting it happen at runtime.

I'm not going around ripping on python, Java or whatever for being slow because they weren't designed to be as fast as possible. C was not designed to be safe — that was intentional to avoid the overheads that introduces. If you are using C where these overheads are worthwhile then you are an idiot. Just because they are worthwhile for you doesn't mean they are worthwhile for everyone.

For example they are not worthwhile for me. You are right, I actually haven't had to ship a production C program — that doesn't mean it isn't important work. I use it for high performance scientific computation that gets run on supercomputers. It is ideal because it is fast and I don't have to worry about security or safety because my collaborators and I are the only ones who run it (garbage in, garbage out is acceptable).

So you seriously want a string type that handles any type of character representation possible in a base language?
That's just dumb.
>You don't know anything about programming if you actually believe all the rubbish you just wrote.
What's the problem with pointing out how this archaic idea is actually infecting code. Sure. I can easily write my own string class that's just an int (or whatever I deem appropriate) and a char*. But because C uses null terminated strings for string literals for instance it makes everyone adhere to that in some way.
>splitting strings
Actually that doesn't work imo. You'd have to eat a char from the start or end of the array to keep it null terminated. Or do a new allocation and a memcpy.

With a length+pointer structure you can just make a new pointer to the new location, calculate the length (length - (newPtr - originalPtr)) and call it a day. Way easier on every level except it's slightly more memory waste.
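Concretely, a split under that representation is pure pointer arithmetic (slice and slice_from are invented names):

```c
#include <assert.h>
#include <stddef.h>

struct slice { const char *p; size_t len; };

/* No allocation, no memcpy, and the original buffer is untouched;
 * splitting a NUL-terminated string in place would have to overwrite
 * a byte or copy the tail instead. */
struct slice slice_from(struct slice s, size_t start) {
    struct slice r = {
        s.p + (start < s.len ? start : s.len),
        start < s.len ? s.len - start : 0
    };
    return r;
}
```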

bstr should be standard

like this: github.com/mpv-player/mpv/blob/master/misc/bstr.h

> C was not designed to be safe — that was intentional to avoid the overheads that introduces.

C was "designed" more for ease of implementation rather than speed. Speed was an accident.

>If you are using C where these overheads are worthwhile then you are an idiot. Just because they are worthwhile for you doesn't mean they are worthwhile for everyone.

C is horrible from a security point of view, and you HAVE to use it, directly or indirectly even if you don't want to. Besides having first class arrays or true strings would not introduce significant overhead (it could actually be faster).

>For example they are not worthwhile for me. You are right, I actually haven't had to ship a production C program — that doesn't mean it isn't important work. I use it for high performance scientific computation that gets run on supercomputers. It is ideal because it is fast and I don't have to worry about security or safety because my collaborators and I are the only ones who run it (garbage in, garbage out is acceptable

What happened to FORTRAN?

I'm definitely going to check it. Any suggestions on where to start?

>So you seriously want a string type that handles any type of character representation possible in a base language?
What the fuck are you even trying to say? You need to support arbitrary character size, that's the only way. It's not "dumb", it's quite literally the only sane thing to do. All major languages do it too, it's not specific to C.
>What's the problem with pointing out how this archaic idea is actually infecting code.
First, it's not archaic but grounded in reason and a lot of experience. Second, null termination doesn't stop you from creating more complex string implementation. You've already been linked to an adequate one, here's another one: github.com/antirez/sds. As you can see, C strings are perfect and could very easily be extended further if the programmer desires so.

adaic.org/learn/materials/

>C supports arbitrary character size
OK were clearly not talking about the same thing here. I'm talking about the C language 'string' concept. Nobody is questioning the low level functionality of treating memory as you wish in C.

I don't know of any language that allows you to specify your own string format and have it act as the language standard string type. C certainly doesn't.

>C strings are perfect
Ignoring the fact that it has had terrible effects on C programs for decades now. Doesn't hold up to the standard you yourself put up for strings in languages and doesn't even serve any use for more serious applications.

It's a bad string representation. Plain and simple. I bet you're just baiting though. It's fine. I don't mind you having some fun.

>could easily be extended further
How? There's no way to extend it at all. A char* is a char*. A string literal ends with a null character. Sure. You can allocate an array of bits and treat it however you like. But that has nothing to do with strings in C at all.

Thanks

>I bet you're just baiting though

With C programmers, you can never be sure.

But that's the whole exact reason it's implementation defined: not all hardware uses two's complement

>laziness
Yeah. Probably it
>Talking about the difference between BCPL string representation which has a byte at the start of the string to represent the length and C which did away with that and went with null terminated strings
>"This change was made partially to avoid the limitation on the length of a string caused by holding the count in an 8- or 9-bit slot, and partly because maintaining the count seemed, in our experience, less convenient than using a terminator."
>bell-labs.com/usr/dmr/www/chist.html
So yeah. Their convenience (because they never did any string processing, why should it be convenient on the usage end?) and how they felt it bothersome to maintain the count.

>So yeah. Their convenience

That's the actual Unix philosophy. Ease of implementation.

>With a length+pointer structure you can [...]
Actually I realize this has the same issue. You obviously need to store the length somewhere.
Anyhow allocating a piece of memory for a pointer and a length is far easier than doing the string copy stuff.

You are fucking stupid. Like, unbelievably so. Look now, idiot, strings consist of characters. Each character is of arbitrary size, depending on the string encoding. ASCII would use 8-bit characters, whereas Unicode can use 8-bit, 16-bit, 32-bit, and so on. Not only that, but the system there is not as straightforward as plain-text ASCII. So literally every language that supports, say, UTF-8 also supports arbitrary size of character. That means your beloved Python or C# or whatever language you are using.

Oh, and all issues with C strings come from idiot programmers (such as you) who have no idea what they are doing but are still doing it nonetheless. The solution is to use software written by non-retards. All problems solved!

You also seem to not differentiate between a string and a string literal, which are two very distinct terms.

>That means your beloved Python or C# or whatever faggy language your are using.
I'm a C programmer primarily.
>insults, no retort
Ok so you're just admitting defeat and conceding that you're being retarded for thinking that C supports arbitrary string encoding. Your library or code might support an arbitrary string encoding but C sure as hell doesn't.
>You also seem to not differentiate between a string and a string literal
I do. I clearly do. I also differentiate between a string representation and a (in the context of C) string representation. Which you do not.
>"None of BCPL, B, or C supports character data strongly in the language; each treats strings much like vectors of integers and supplements general rules by a few conventions."
>bell-labs.com/usr/dmr/www/chist.html
Sorry for being consistent with the wording of the creators of the language?

Send in your application at:programs.ksoutdoors.com/Programs/Become-a-Certified-Bait-Dealer
I'm sure you'll pass.
>The solution is to use software written by non-retards.
True. But we're discussing a specific language here so we're kinda constrained. And I'm sure you've had to deal with C-strings (as they come, out of the box) at some point in your (seemingly) limited programming career. Everyone deals with it. And as it turns out, not even Linux is written by people who aren't retarded. If you look at the Linux API they take C-strings (char*) when you wanna open a file. And guess what. They rely on null terminated strings. You can't input your whatever-string format and specify a length.

So yeah. They're kind of a big deal. I'm personally not entirely sure why you'd write an API like that. I certainly don't if I can help it (outside demand, for instance). It's a language problem at its core.

Give a middle finger to backwards compatibility
Make C 2.0

>C 2.0
github.com/BSVino/JaiPrimer/blob/master/JaiPrimer.md
Not really a C 2.0 clearly.
But its goals are nice.

>Arrays do not automatically cast to pointers as in C. Rather, they are "wide pointers" that contain array size information. Functions can take array types and query for the size of the array.

Noice

*yawn*

en.wikibooks.org/wiki/Ada_Programming/Types/access#Fat_Pointers

>*yawn*

We are in the context of C, remember.

No classes is the biggest problem

You've got structs and function pointers.
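A minimal sketch of that pattern: a struct of data plus function pointers acting as a hand-written vtable (shape and rect_area are invented names):

```c
#include <assert.h>

struct shape {
    double (*area)(const struct shape *self);  /* "method" slot */
    double w, h;
};

/* One concrete "subclass" behavior, bound through the pointer. */
double rect_area(const struct shape *s) {
    return s->w * s->h;
}
```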

>Speed was an accident.
You are a confirmed moron.

If you have to use it that is your problem, not a problem with the language.

Fortran is great for problems on regular domains where data can be operated on in blocks. The work I do involves irregularly structured problems (complex networks) and C is in general better to use.

C is so dominant because it is so good at what it does well. Your complaints are like moaning about a Ford Fiesta being shit at off-roading... it wasn't designed for doing that, so why would you use it for that, or at the very least why would you complain that Ford Fiestas are shit at off-roading.
>B-but if the designers of the Ford Fiesta had done that, l-less people would get stuck off-road.
Moron.

int num = 0;
num = (num++) + (num++);
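That snippet modifies num twice with no sequence point between the modifications, so it is undefined behavior and has no single "correct" result. One well-defined spelling of a plausible intent (the interpretation is itself a guess, precisely because UB code has no fixed meaning):

```c
#include <assert.h>

int two_increments(int *final_num) {
    int num = 0;
    int sum = num + num;   /* both reads happen before any increment: 0 */
    num += 2;              /* the two ++ operations, made explicit */
    *final_num = num;
    return sum;            /* well-defined: 0 */
}
```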

>You are a confirmed moron.

What's the deal with C programmers calling other people morons?
C was designed to fit on crappy machines, and so is quite low level. You don't say assembly was designed to be fast; it was designed to map 1:1 to the CPU architecture, that's why it's fast, and the same reasoning is valid for C.
Also, once C got popular, architectures tried to fit the language, so it became even easier to write fast code in C.
You are not as smart as you think you are.

>What's the deal with C programmers calling other people morons?
C programmers tend to have the worst social skills of all computer scientists.

Why would you want a language that is a one-to-one mapping of the architecture? Because the binaries you compile will be fast, you moron.

And I am not calling other people a moron, just you and anyone else who thinks C is shit because it is low level. It might be shit for your purposes or many other purposes, but it is good for what it was designed to do, which is why it is still the dominant language for kernels and embedded systems to this day.

>C programmers tend to have the worst social skills of all computer scientists.

Don't forget the Dunning-Kruger effect thing

>And I am not calling other people a moron, just you and anyone else who think C is shit because it is low level.

C is shit even for a low level programming language, it's deeply flawed.

As usual for this thread in its countless incarnations, the clearly most undeniable answer:

>array decay

cuts the deepest:

> ...

Clearly any callee or returnee dealing with arbitrary length arrays needs to convey length information (excluding only the case of statically sized buffers, which should get their own special treatment).

This is *not* the same thing as dynamic bounds checking, which the zealots always try to conflate into being the same thing.

If the information needs to be conveyed and preserved anyways, it would only have been a pure positive for the language to offer syntactic sugar for an additional fat pointer type.

Bonus: this would probably also have killed C strings in their infancy, which in retrospect were more of an anti-optimization if anything.

> other acceptable answers include fixing all the undefined behavior and fixing the preprocessor semantics, especially unhygienic macros

What's wrong with prepending cuck_ in front of everything?

I don't understand namespaces. Like, what's the use case?
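The use case is avoiding name collisions between libraries; the C workaround is exactly the per-library prefix mocked above. A toy illustration (the json_/ini_ names are invented):

```c
#include <assert.h>

/* Two "modules" that would both want a function called parse;
 * without namespaces, the prefix *is* the namespace. */
int json_parse(const char *s) { return s[0] == '{'; }
int ini_parse(const char *s)  { return s[0] == '['; }
```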

No one cares about Windows desu

>char[] array;
Instead of
>char array[];

This is valid, but make it standard
>char* ptr;
Instead of
>char * ptr;
Or
>char *ptr;
To declare a pointer to char.

>char c = *ptr;
to dereference is ok.
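One wrinkle with the `char* ptr;` spelling: the * grammatically binds to the declarator, not the type, which bites in multi-variable declarations (one_pointer_one_char is an invented name):

```c
#include <assert.h>
#include <stddef.h>

int one_pointer_one_char(void) {
    char *p, c;   /* p is a char *, c is a plain char: the * binds to p only */
    (void)p;      /* unused; sizeof below does not evaluate its operand */
    return sizeof p == sizeof(char *) && sizeof c == 1;
}
```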

I prefer C# for non critical/embedded stuff tho.

>add namespaces
>add templates
>call it C+