Has any one language done more damage than this piece of shit?

>you just can't handle memory management, you pleb
Literally not an argument.

There are plenty of languages that let you manage memory manually that don't have asinine features like null-terminated strings or a lack of a distinct bool type.

Face it: C is just pure shit. The fact that smart people (like me) can contort it to compile into fast programs doesn't change the fact that it's flawed by design.

I work with C and I agree. C is godawful and archaic, but you can't get around using it because 1) it's pretty much the industry standard, since everyone uses it, including tons of open-source projects, and 2) for low-level stuff it works beautifully, because it's basically just a portable, more easily maintained assembly.

> smart people (like me)
Brainlet detected.

What do you need a bool type for? All it is is a bit with an explicit naming convention. Bools are bloat.

Do you have shit for brains? C wasn't designed to be a high-level language, and if you don't need it you can just go use Java.

>lack of a distinct bool type
C has had a "proper" bool type for 18 years.

Other languages do have null-terminated strings and a lack of a bool type. They just pretend otherwise, by having a fancy string type and 32-bit bools. C doesn't cover up the details.

>What is "sage"?
>Entering "sage" (by itself) into the [Options] field while replying will cause the thread not to bump to the top of the page. Contrary to popular belief, a sage is not a downvote, and should not be used as one. "sage-bombing" or announcing that you've saged a thread may result in a ban.

Useful knowledge for this kind of shitpost.

Actually, bools are not bloat. C++ having a bool type means that, for example, the std::vector<bool> template specialisation can be implemented as a bitmap.

Explain how it's less "high level" to have the string length in a prefix byte instead of terminating with a 0.

Literally the same amount of memory.

>Limiting strings to 255 characters
>Substrings and iteration with pointers no longer works
That is not an acceptable tradeoff.

It's not about memory efficiency, it's about O(1) vs O(n) for a bunch of string operations.

Using the bloat language itself as an example of why it's not bloat.

Don't forget function/method overloading.
Cfags just miss the point of type systems.

Right. With a prefix, string length becomes O(1). It's O(n) with a null terminator.

He is correct though. It also allows other optimisations, such as compressing bool members in a struct to bitfields.

C actually has function overloading, though.
It's honestly a pretty useless feature most of the time, in any language.

Switch to C++ and use std::string.

C was literally a "high-level language" when it was created, due to its portability, but the meaning of "high level" has changed since then.

That guy is an idiot.
Having a length field actually allows more efficient substrings, and does not get in the way of pointer iteration at all.
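
For what it's worth, a minimal sketch of the layout being described, i.e. a pointer plus a length field rather than a literal one-byte prefix (struct str, str_from and str_sub are made-up names): length is O(1), and a substring is just another (pointer, length) view into the same buffer, no copy and no terminator required.

#include <stdio.h>
#include <string.h>

struct str { const char *data; size_t len; };

static struct str str_from(const char *s)
{
    return (struct str){ s, strlen(s) };
}

/* O(1) substring: no copy, just a view into the same buffer */
static struct str str_sub(struct str s, size_t off, size_t n)
{
    return (struct str){ s.data + off, n };
}

int main(void)
{
    struct str s = str_from("null terminators");
    struct str sub = str_sub(s, 5, 4);
    printf("len=%zu sub=%.*s\n", s.len, (int)sub.len, sub.data); /* len=16 sub=term */
    return 0;
}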

There is another way.

std::vector<bool> is a terrible idea, like most template specializations.

>It also allows other optimisations, such as compressing bool members in a struct to bitfields
No C compiler would ever think of doing this. Struct member layout is extremely important and it's not something they will fuck with unless explicitly told.
Also, C has had _Bool since C99, you fucking mouthbreathing retard.

Also very true. Especially painful when dealing with NULL vs nullptr.

Exactly.

>C actually has function overloading, though.
No?

>Other languages do have null-terminated strings
No they don't, idiot.

>>C actually has function overloading, though.
_Generic can be used to implement function overloading.

>C actually has function overloading, though.
You're talking about C11's _Generic macros?

It's only useless if your type system sucks.

Java and C# are both shit at memory management, though. C++ is the only one that gives you full control.

This.
C++ is literally the best language for both high level and low level development.

>Struct member layout is extremely important
Which is why they are compressed into bitfields when possible, in order to fit into the nearest alignment boundary rather than scaling upwards.

>Also, C has had _Bool since C99, you fucking mouthbreathing retard.
Where did you get the impression that I didn't know this?

Also, _Bool is a bitfield if you use GCC.

Very simplistic macro magic, but yeah, for sure. I forgot about C11 for a second. Anyway, you need strong typing in order to differentiate functions.

>Which is why they are compressed into bitfields
No, they aren't, you fuckface.
Go compile some structs with bools in C and look at their size. No compiler is going to do that.
That is the sort of shit that breaks ABIs.
>Bool is a bitfield
Only if you explicitly set it as a bitfield.
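
To make the bitfield thing concrete, a quick sketch; the 4-vs-1 sizes are merely typical, since exact layout is ABI- and implementation-defined:

#include <stdio.h>
#include <stdbool.h>

struct plain  { bool a; bool b; bool c; bool d; };                 /* typically 4 bytes */
struct packed { bool a : 1; bool b : 1; bool c : 1; bool d : 1; }; /* typically 1 byte */

int main(void)
{
    printf("%zu %zu\n", sizeof(struct plain), sizeof(struct packed));
    return 0;
}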

_Generic IS strongly typed.

Make a generic expression that is able to differentiate between NULL and 0, please. I'm genuinely curious.

This has been an issue with C++ forever:
#include <cstdio>

void foo(void* p)
{
    puts("hello");
}

void foo(int i)
{
    puts("gentoo");
}

int main()
{
    foo(NULL); /* ambiguous or foo(int), depending on how NULL is defined */
    foo(0);    /* foo(int) */
    return 0;
}

>Make a type expression that matches values
Why don't you ask a question that isn't fucking stupid?

Still does retarded shit like this.
#include <stdio.h>

void overloaded_c(char c)
{
    printf("char");
}

void overloaded_i(int i)
{
    printf("int");
}

#define overloaded(X) _Generic((X), \
    char: overloaded_c, \
    int: overloaded_i \
)(X)

int main()
{
    overloaded('x'); /* prints "int": in C, 'x' has type int */
}

C++'s type system might suck, but it at least makes a token effort to patch over these holes.

>I can't do what's asked
Guess C doesn't have such strong typing after all, then. Thought so.

The fucking stupid thing here is that NULL is not a pointer type, it's an integer type.

C++ has solved this with nullptr, which is of type nullptr_t which is a pointer type that implicitly converts to all other pointer types.
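
As a counterpart to the NULL snippet above, a minimal sketch of what nullptr buys you with the same overloads:

#include <cstdio>

void foo(void* p)
{
    puts("pointer");
}

void foo(int i)
{
    puts("int");
}

int main()
{
    foo(nullptr); /* unambiguously the pointer overload */
    foo(0);       /* unambiguously the int overload */
    return 0;
}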

>Hurr durr, I'm a retard
Please learn the semantics of C before you try to criticise it.

nullptr doesn't have this problem.

>he's starting his computer, booting an OS that's written in C
>he's opening his browser, written in C
>he's going on a certain website, on a server written in C
>then his JavaScript implementation (written in C) opens up a dialog box
>he's writing a post about how bad C is

OP, I don't even...

>_Generic IS strongly typed
You made the claim, retard.

>wants function overloading
>thinks a char literal being an int isn't retarded

>The fucking stupid thing here is that NULL is not a pointer type, it's an integer type.
In C, it's definitely a pointer type.

nullptr is just stupid shit to patch over the fact that C++ doesn't have automatic void * conversions.

#include <stdio.h>

#define type(v) _Generic((v), void *: "stupid", int: "fuck")

int main()
{
    printf("%s\n", type(NULL)); /* "stupid" where NULL is (void *)0 */
    printf("%s\n", type(0));    /* "fuck" */
}

stupid
fuck

I know C's semantics, and I'm telling you that they're fucking stupid.

>hurr how can you criticize capitalism when you WEAR CLOTHES

This is what you sound like right now.

"We can do better" doesn't mean "I'm going to ditch everything and shit in the woods now."

Why are you relying on implementation-defined behavior?

I'm not. The standard definitely says NULL is defined to be (void *)0.
It's just that its underlying bit representation may not be zero.

>nullptr is just stupid shit to patch over the fact that C++ doesn't have automatic void * conversions.
No, it's "stupid shit" to deal with the fact that implicit void* casting weakens type safety.

And great example, now do it with a char literal or with a void* pointer type and an int* pointer type.

#include <stdio.h>

#define type(v) _Generic((v), void *: "you're", int *: "still", int: "stupid")

int main()
{
    int n;
    printf("%s\n", type(NULL)); /* "you're" */
    printf("%s\n", type(&n));   /* "still" */
    printf("%s\n", type(0));    /* "stupid" */
}

you're
still
stupid

I meant to change the 0 to a char literal. The results don't change, though.

The cast may or may not be there. It's implementation-defined.

Fucking retard.

#include <stdio.h>

#define type(v) _Generic((v), int: "stupid", char: "fuck")

int main()
{
    printf("%s\n", type('x')); /* prints "stupid": 'x' has type int in C */
}

That has literally nothing to do with _Generic.

>_Generic doing the retarded thing is nothing to do with _Generic

And they all have endless bugs and exploits.
Your point?

No, it has to do with the fact that typing in C is weak.

#include <cstdio>

void foo(char c)
{
    puts("stupid");
}

void foo(int i)
{
    puts("fuck");
}

int main()
{
    foo('x'); /* calls foo(char): in C++, 'x' has type char */
    foo(0);   /* calls foo(int) */
    return 0;
}

$ g++ overload.cpp && ./a.out
stupid
fuck

_Generic is doing precisely the right thing.

That has nothing to do with typing strength, you retarded fuck.
In C, character constants are typed as ints. In C++, they aren't.

>the right thing

Implying

See above, fuckface.

>character literals are integers
>the right thing
Wow, C autists are really going off the rails with this one.

That's fine as long as your language is weakly typed and lacks overloading, but as soon as you introduce overloading it completely fucks the semantics.
This is why C++ turned char literals into chars, and deprecated the use of an integer literal as a macro for pointers. In any language where types are important, defining a constant to have a different type than the thing it represents is utterly braindead.

>In C, character constants are typed as ints.
Because C is weakly typed and it doesn't really matter.

>In C++, they aren't.
Because C++ is strongly typed and overloading wouldn't work if it wasn't.

Stop changing the fucking subject.
I only said _Generic is strongly typed (it won't do any automatic conversions).

And I'm saying _Generic was a mistake because C's type system makes it do stupid shit.

>I only said _Generic is strongly typed
And that was moving the goalpost; the post you replied to talked about strong typing in C++.

>I only said _Generic is strongly typed
And what's the fucking point of that when the result is inconsistent and counterintuitive, because the rest of the language sucks balls in regard to typing everything as int by default?

No, _Generic serves its purpose. I'm pretty sure it's just there so they didn't have to rely on "compiler magic" for <tgmath.h>, nor add a billion new functions to <math.h>, and they thought they may as well give the same power to programmers.
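
Roughly the <tgmath.h> trick in question, sketched with a made-up generic_cbrt macro (link with -lm):

#include <stdio.h>
#include <math.h>

#define generic_cbrt(x) _Generic((x), \
    float:       cbrtf,               \
    long double: cbrtl,               \
    default:     cbrt                 \
)(x)

int main(void)
{
    printf("%f\n", generic_cbrt(27.0));  /* dispatches to cbrt(double) */
    printf("%f\n", generic_cbrt(27.0f)); /* dispatches to cbrtf(float) */
    return 0;
}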

>putting lipstick on a pig

What inconsistent behaviour? _Generic matches exactly what it says it does.
Also, nobody really gives a shit about char literals. They are very rarely used, except for when matching characters in a string.
When was the last one you passed one into a function.

That last sentence was supposed to be a question.

>Y-you don't need it anyway
How about passing a null pointer to a function, you apologist fuck?

It's funny how Cniles will defend a feature that's even more botched in its implementation than anything in C++ simply because it isn't C++.

>vector<bool> for a bitmap
why not just use any integer
and if you are really fucking crazy about your bitmap
use a fucking array or vector of said integer type

>Literally not an argument
Not an argument

Learning C as a first language fucked me up. Things I thought were common between all programming languages like vectors, string manipulation, memory handling, etc. turned out to just be "a C thing". I wish I had just learned Java or Python first.

You understand that a bitmap has the benefit of being able to compact at least 8 values into a byte, right?

>array or vectors of integers rather than a bitmap
Great idea, let's bloat memory to hell and beyond. People have 16 GB, just use it all!!

Accessing a bitfield is going to be so much less efficient than a plain array of bool that memory concerns are basically irrelevant.

you must've never seen python and java benchmarks, on modern hardware at that

>accessing a bitfield is going to be so much less efficient than a plain array of bool that memory concerns are basically irrelevant.
Accessing an array means looking up a memory address and fetching a cache line, potentially evicting something important from the cache. In the absolute worst case (and this is also true for a bitmap, mind you) you might even have to wait for a memory fence just to change a single bool. Bit operations like masking and or-ing are completely negligible compared to memory loads and stores.
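
For reference, the bitmap being argued about is just mask-and-shift over a byte array, something like this (bitmap_set/bitmap_test are made-up names):

#include <stdio.h>
#include <stdint.h>

/* 8 flags per byte */
static void bitmap_set(uint8_t *bm, size_t i)
{
    bm[i / 8] |= (uint8_t)(1u << (i % 8));
}

static int bitmap_test(const uint8_t *bm, size_t i)
{
    return (bm[i / 8] >> (i % 8)) & 1;
}

int main(void)
{
    uint8_t flags[4] = {0}; /* 32 bools in 4 bytes */
    bitmap_set(flags, 10);
    printf("%d %d\n", bitmap_test(flags, 10), bitmap_test(flags, 11)); /* prints: 1 0 */
    return 0;
}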

C is typed as strongly as it needs to be, folks. It's a systems language; get over it.

C++ I don't know much, but just looking at the manual, it's an application language - which means types are important.

C - types are handy references, is all.

We agree.
The problem is not with C's type system but the _Generic macro. Any feature like _Generic which makes decisions based on types does not belong in a weakly typed language like C.

Sorry haven't encountered _Generic

Sounds horrid and I won't ever touch it.

fwiw I write C malloc-less, so I use it basically as a glorified assembler. OS calls are verboten, except for debugging of course.

Being a systems language is not a valid excuse to have a shitty type system and shitty abstraction support.

Assembly has no notion of types.
C is fine for what it is.

Why the fuck do you need abstraction?

C is not assembly.

And what, are you saying that any language that compiles to assembly must not have types? You're fucking retarded.

>malloc-less
>OS calls are verboten
>glorified assembler
What the fuck are you writing in C? An OS? Very small libraries?

Productivity.
Safety.

Basically just things C programmers don't know.

The wannabe Java/C# programmer has spoken.

Be yourself, friendo

Actually, I program in Rust, not sepples, aka mutant C on steroids.

Is OP's pic really a good book or just a meme?

Don't pretend to be me, fucking faggot Rust programmer.
I program in the best programming language on earth: C++.

A compilation target is not the same as a source grammar or intended use case, you fuck knuckle.

It's a good book if you want to learn the very basics while at the same time learning about the mindset of the people who made it, but don't expect to learn idiomatic C from it.

>unlike SEVERAL MILLION other people, I have trouble with C
>but I am the smart one, and C is stupid
Hmm...

Sounds like embedded systems to me.