Is const a meme? Why the fuck are C and C++ programmers so fucking autistic about making everything const...

Is const a meme? Why the fuck are C and C++ programmers so fucking autistic about making everything const? Is it because of the "hurr durr I gotta protect myself from my own mistakes" meme or is it because there's actual benefits to it? It looks ugly as fuck and is usually unnecessary


const is good for protection, but it's also there to let the compiler know you won't be changing the value of something. This allows the compiler to make optimizations it couldn't make if you hadn't marked the data as const.

muh compiler optimizations

>there's actual benefits to it?
Yes, it can help performance. In non-trivial code it's hard for the compiler to prove on its own that a value is never modified.

alright so like what if the compiler just sees all my shit and then makes it const for me why don't they just do that huh

not worth making my code look fucking weird as shit

const int* const something(const int x, const int y, const unsigned int z, const int* const&) const;

it's information for the compiler so it can do additional optimizations based on the knowledge that the object isn't mutated; it's not just for programmers to avoid mistakes

the compiler cannot make your shit const for you because it cannot understand your intentions. it only understands your code.
protip: it thinks your code is shit

yeah well your compiler thinks you're an ugly piece of shit now stfu

In any case, is there any reason I should use it in my own code considering I'm not making anything demanding enough to deem it necessary? Is the performance impact that huge?

On a small program, the performance gain would be minimal. But it's good form to mark data you aren't changing as const, because it helps stop certain bugs, and the performance gains add up as a project gets bigger.

const char puts a null terminator on the end for you where char does not. a lot of things require the string to be a constant.

>Is the performance impact that huge?
No, but it adds up. Otherwise you're going to end up with a slow as fuck program that everyone likes to laugh about. (For example, Crysis: pcgamer.com/but-can-it-run-crysis10-years-of-a-pc-gaming-meme/)

if you're not writing a codec or trying to optimize out every potential cache miss, then there's probably no reason

This is important when using someone else's libs. For example, if a method is marked const, you can call it from another const method; without that, the compiler would scream.
It's also information about the programmer's intention of what the code should do, and it makes debugging easier.

>or is it because there's actual benefits to it?
Yes, compiler optimizations: the compiler can optimize using the knowledge that the object can't be mutated.

You don't have to make value parameters const you retard.

>const char puts a null terminator on the end for you
No it fucking doesn't you retarded piece of shit.

const is basically never used by the optimizer

>absolute state of Sup Forums

Wrong.
Try playing with godbolt some time.

I'm quoting Chandler Carruth, so whatever.

No const: godbolt.org/g/zPFXmS
const: godbolt.org/g/QiVVHw

He is wrong.

It's simple things like
for (auto i = var.begin(); i != var.end(); i++)
{
var.non_const_function();
}

This forces the compiler to re-evaluate end() on every iteration, because the non-const function could change the state of var, and hence the value of end().

I assumed the OP was talking about function arguments and other local stuff, obviously globals and C++ const functions are wildly different.

c and c++ are shit
use rust you retarded faggot

t. manchild retard

To get more girls into programming. #kodewithKarlie

just typedef that shit or somethin, nigga, what is ur problem

int main()
{
int n = 10;
int arr[n];
}


>compiler makes an underlying dynamic allocation doing a lot of shit behind your back

int main()
{
const int n = 10;
int arr[n];
}


>compiler allocates the space on the stack - no dynamic allocation needed

No

please move to Javascript and never comment on C again

Wrong you fucking retard.

fuck off soroszilla shill

that thread was apparently posted by another retard who hasn't programmed anything more complicated than fizzbuzz

>const parameters
dumb nigger

that's the whole thing i'm making the thread about you dumb fuck, why do people always put const in parameters?

you already got the answer to that
fuck off

A pointer or reference being const is different to its referent being const, dumbass.

VLAs != dynamic memory allocation.

Wrong. const has nothing to do with being a pure function. A call to a function which takes only const parameters may still mutate a static or global variable, and therefore cannot be elided.

it literally is
only it's allocating to the stack and not the heap

That's because they're constexpr.
constexpr is implicit here, even if it's not annotated - and constexpr is what allows the compiler to fold that constant, not const.

Replace it with something that is not constexpr, like a member of an object, and see what happens.
godbolt.org/g/1jiYwC

>compiler optimizations
I thought const was just an explicit promise to other programmers that your functions / prototypes won't change shit

It's bullshit. Const is only a promise.

int x = 10;
Compiler will never know when you will fuck him up with some x=11 or similar shit, and you WILL fuck him up. It will read, line after line, waiting for the inevitable new assignment, the sweat covering its forehead. Every night it will kiss its wife and children goodnight, pretending everything is ok, but if you stay silent you can almost hear its heart pounding in its chest, the stress under its eyes.
const int = 10;
Good shit dude, ya need x? No problema pal, I just gon' get that 10 for ya.

>const int = 10;
>ya need x?
how did the compiler know?
Roman compiler?

The compiler can deduce that something is const. It is not for optimization.

g++ at -O2

void something();

extern int z;

int testme(int *k)
{
int res = *k + 3 + z;
something();
return res + 2 + z;
}


This turns into 9 instructions:
testme(int*):
movl z(%rip), %eax
movl (%rdi), %edx
pushq %rbx
leal 3(%rdx,%rax), %ebx
call something()
movl z(%rip), %eax
leal 2(%rbx,%rax), %eax
popq %rbx
ret


But with const it turns into 12 instructions:
void something();

extern const int z;

int testme(int *k)
{
int res = *k + 3 + z;
something();
return res + 2 + z;
}


and the assembly becomes:
pushq %rbp
pushq %rbx
subq $8, %rsp
movl z(%rip), %ebx
movl (%rdi), %eax
leal 3(%rbx,%rax), %ebp
call something()
addq $8, %rsp
leal 2(%rbx,%rbp), %eax
popq %rbx
popq %rbp
ret

That's curious. I wonder how they benchmark.

Immutable things are good. I for one never let anything mutate.

>1GB/s garbage

I'm a JS developer and I'm also autistic about const.
I also use Immutable.js everywhere I can.

I only use variables as a last resort.

Variables are the gateway to bugs.

Why don't you just write Scheme?

>this allows the compiler to make further optimizations
No. const is worthless in C++ optimization because reinterpret_cast and const_cast exist.

I would if I could make money with Scheme.

Modifying through a const_cast is undefined behavior if the pointed-to object is itself const. It only successfully removes constness if the constness was introduced through pointer conversions.

github.com/jashkenas/coffeescript/wiki/list-of-languages-that-compile-to-js
I see Scheme transpilers.

Here you go. I did just two runs, with the "performance" cpufreq governor.

Assuming that something() is just:
something:
ret


First version: 3.535s, 3.554s
Second version: 3.659s, 3.647s

main.c:
#define ITERATIONS (600*1000*1000)

int testme_first(int *);
int testme_second(int *);

int z;

int main()
{
int i;

z = 3;

for (i = 0; i != ITERATIONS; i++)
testme_second(&i);

return 0;
}

Why is it doing this?

It tries to be too clever and preserve more across the function call, in such a way that it also has to rectify the alignment of the stack pointer. The stack pointer should be aligned to 16 bytes before a function call on x86_64.

As far as I recall, some functions like printf with a float will crash if the stack pointer is not aligned properly when called.

I guess some heuristic that usually turns out favorable in the general case just fucked up in this edge case.

Dumb nigger thinks he's god but he can't even reverse entropy in a closed system