#include <stdio.h>

int main() {
int hello = 1;

printf("%c\n", hello["hello"]);
return 0;
}


What's the output and why? Don't cheat or I will tell your mom.


Sigsegv

This.

>mfw op asks these basic ass questions

wrong

>compiled the code
wat

It tries to access *("hello" + hello), i.e. one char past the start of the string literal, or something

"hello"[1] == 'e'

Okay, it compiles and prints the letter "e".
Now the real question is: Why is that?!

It's 'e'.
But that doesn't make C retarded. It's part of how subscripting resolves to pointer arithmetic.
hello["hello"]
resolves to
*("hello" + hello)
which, since hello == 1, is the same as "hello"[1].


So in short, OP is a fag

>write something that doesn't make sense
>surprised the results don't make sense
Are you fucking retarded?

5["abcdef"] is the same as "abcdef"[5]

Rust doesn't have this problem

>complains about defined and logical behavior
Wow OP, you really suck cocks!

hello = 1
1["hello"] == "hello"[1] == 2nd character == 'e'

>1["hello"]

Can anyone explain the logic behind having such an indexing method?

>Look mom I learned something on the interwebs
That trick works with integer arrays.
I will go with undefined behaviour, so either "e" or "OP is a faggot"

int main() {
you disgust me

"hello"[1]
1["hello"]
*("hello" + 1)
All the same shit

this is the only correct way
int main(int argc, char **argv)
{

}

void foo(char bar[5])
{
char baz[5];
}
What are the types of bar and baz?

Drive by question from noob here,

Does the "int" in "int main" mean the function returns an int value, or is it just part of the definition? And how come it's sometimes "void" yet the program still produces output? Or did I interpret it wrong?

It's the exit code of the process.
If a program returns 0 from main, it completed successfully. If it returns any other value, it failed in some way. You can check that value from a shell on Linux.

Since nobody uses it!

>it's another "I did something retarded and C just let me do it!" thread
Post your best and salvage this garbage heap, Sup Forums

An array starts at 0, the 2nd byte in "hello" is e

void main is not one of the standard signatures, but most compilers allow it regardless. With it, the value handed back to the environment is unspecified, so you can't tell whether the program was successful. Forgetting the return statement in main used to behave the same way, but since C99 falling off the end of main is equivalent to return 0.

e
Anyone who knows C should get this.
Then again, C is complete garbage anyway.

And people complain about Javascript letting you do stupid things.

What's stupid about arrays?

hello is not a pointer type

In C? arrays suck because they turn into pointers if you sneeze at them

read the standard, nigger

>arrays aren't pointers
???

They aren't.

How?

You asked for it.

#include <stdio.h>
#include <stdbool.h>

// X = Value, Y = Mutable
#define __let(X, Y) _Generic((X), \
/* Basic */ \
default: _Generic((Y), \
false: void, \
default: const void), \
void *: _Generic((Y), \
false: void *, \
default: const void *), \
char: _Generic((Y), \
false: char, \
default: const char), \
char *: _Generic((Y), \
false: char *, \
default: const char *), \
\
/* Signed */ \
signed char: _Generic((Y), \
false: signed char, \
default: const signed char), \
...
double _Imaginary: _Generic((Y), \
false: double _Imaginary, \
default: const double _Imaginary), \
double _Imaginary *: _Generic((Y), \
false: double _Imaginary *, \
default: const double _Imaginary *), \
long double _Imaginary: _Generic((Y), \
false: long double _Imaginary, \
default: const long double _Imaginary), \
long double _Imaginary *: _Generic((Y), \
false: long double _Imaginary *, \
default: const long double _Imaginary *) \
)

#define _(X, Y) __let(X, Y)

typedef union let
{
void;
char;
short;
int;
long;
float;
double;
} let;

int main(int argc, char **argv)
{
let stupidshit = _((char* )"lol this is so fuckign badf", true);

puts(stupidshit);

stupidshit = "wow the string changed this is l33t h4x";

puts(stupidshit);

let morestupidshit = _((int)69, false);

stupidshit = "ok lets end this abomination of a \"\"\"\"program\"\"\"\"";

puts(stupidshit);

return morestupidshit == 69 ? 0 : 1;
}

ghostbin.com/paste/za63q

int arr[10];
printf("%zu\n", sizeof arr);
The fact that you don't realize that arrays aren't pointers is an example of why C's attempt to confuse the two concepts was misguided.

Try to sizeof an array. To your surprise it will work and give the full size in bytes. It does not work for dynamically allocated arrays though: there sizeof just measures the pointer.

Not when the array in question is a function parameter though, because in that case it decays to a pointer.

How was this not mentioned in any school/college book? Where exactly can I read more into this?

Because in my experience, college courses do not delve deep into the nitty gritty of a language.

See: any C or C++ class. Most C++ classes are especially bad with this, since from what I've seen they usually try to teach you C++ written in C style instead of idiomatic C++.

In addition to reading up on it, I'd advise you to toy with the language itself to test stuff like this out. repl.it can be useful for that if you're too lazy to cook up a simple Makefile.

Just google array decay.
The gist of it is this - whenever you do almost any operation to an array, it gets silently converted to a pointer to its first element.
int arr[10];
arr[5]; // arr gets converted to a pointer, incremented by 5 and then dereferenced
int arr2[5] = arr; // fails to compile, can't assign to an array
int *ptr = arr; // arr is implicitly converted to a pointer. Compiles fine.

Also like said, watch out for your function parameters.
void foo(int arr[10]);
arr is not an array. It's a lie, plain and simple. arr is actually a pointer. Your sizeof operator will give the wrong result if you try to use it on arr. There's a good reason Torvalds says to never use arrays in your function parameter declarations.

array of const char

Thank you for this, now what language made for and by white people can you recommend?

e

whats not to get, its basic as shit

printf prints a character plus a newline; "hello" is an array of chars, so element 1 of that array gets printed, which is 'e'

To add to this:

>Your sizeof operator will give the wrong result if you try to use it on arr.
The actual size it will return is whatever the size of arr (which has decayed to a pointer!) is, so on x86-64 that would be 8.

Holy fuck that vector is so fucking terrible

You can most of the time mimic pointer behaviour but there's a reason you see &array[0] rather than array in most code

It really depends on what you want to get done. A language should be picked as a means to accomplish what you want to do, and not the other way around.

I personally really like idiomatic C++ because of how easy it is to declare intent with the tools that you're given (const, references, different types of smart pointers), along with the incredibly powerful combination of templates and function/operator overloading if used properly. However, I do think that it's a language that makes it very easy to shoot yourself in the foot and produce unreadable garbage. It's just very easy to create crap code without being aware of it.

I dislike C++ references personally but you have the right idea

"e" because adding is commutative.
You'd have to be dumb to think 1 + 2 isn't equal to 2 + 1.

How can one find 1["ab"] == 'b' logical?
It's painful garbage syntax and the compiler shouldn't accept it.

because subscript notation in C is a leaky, leaky abstraction

Quoting the C11 standard draft
A postfix expression followed by an expression in square brackets [] is a subscripted
designation of an element of an array object. The definition of the subscript operator []
is that E1[E2] is identical to (*((E1)+(E2))). Because of the conversion rules that
apply to the binary + operator, if E1 is an array object (equivalently, a pointer to the
initial element of an array object) and E2 is an integer, E1[E2] designates the E2-th
element of E1 (counting from zero).

[] operator is two-way
it means arr[1] and 1[arr] are the same

that's why

#include <stdio.h>

typedef const void *fn;
typedef fn let;
typedef void *let_mut;

fn main(int argc, char **argv)
{
let wow = "this is fucking retarded";
let_mut wow2 = "wow this string will mutate";
let retarded = 1;

for (let_mut i = 0; i < 101; puts((int)i++ % 5 ? "" : "Buzz"))
printf("%i\r%s", i, (int)i % 3 ? "" : "Fizz");

printf("%s\n%s\n", wow, wow2);

wow2 = "wow this string mutates so fucking l33t";

printf("%s\nWill return: %d\n", wow2, retarded ? 0 : 1);

return retarded ? 0 : 1;
}

tl;dr: you're retarded

>C program
Sigsegv

Gosh C is confusing

I'm dev lead at a Perl house and I know my future junior devs are going to have just as much trouble understanding the idiosyncrasies of Perl as I do with C. I'm starting to understand the value of languages like Java and C#, which limit what idiots can do when they hack onto your projects. It's sad but true. It's hard to come by people who fully grasp any language, and when you do find them, they won't accept any position that isn't already leadership or leadership-bound.

C is a small and regular language, Perl on the other hand is full of different ways to achieve the same thing and bizarre syntax.
I can understand C being difficult, but I'm not sure why it might be confusing.

I'm not the smartest man and I haven't studied C as much as I'd need to to understand how to use it safely

Using it safely is a completely different story, yeah.

someone doesnt understand pointers

a[b] and b[a] are the same for all a,b.

Savage.

Pointers is OK, but this 10["charactersblabla"] notation is total cancer

>language quietly inverts the conventional meaning of the subscript operator when provided garbage

Where is the not-retarded part?

OP, it's already been a month and you keep making these shitty threads and posts on /dpt/. Don't you have something else to do?

int main()??

>Perl house
Is that a thing?

There are a couple of us left in populated areas. It's faster than both Python and PHP.

kek

a[b] is equivalent to b[a]

const char* s = "hello"; // s = 0xdeadbeef
int hello = 1;

// hello[s] = 1[0xdeadbeef] = *(1 + 0xdeadbeef) = *(0xdeadbeef + 1) = 0xdeadbeef[1] = s[hello]
char e = hello[s];

It's big in Japan still, despite Ruby being a thing

a[b] = *(a+b) = *(b+a) = b[a]

e

Because array[i] == *(array + i) == i[array]
Literally page 30 or so in K&R

hello["hello"] is the same as "hello"[hello]. It's just syntactic sugar for *("hello" + hello)

brainlets

shit, meant *(&"hello"[0] + hello)

>this is the only correct way
Nope. int main(void) is correct too.

Brainlet.

It may compile, but it's not correct.

...

Both of those styles are disgusting.

I prefer **argv too, and it's also valid (equivalent).

Doesn't change the fact that I'm right, and you're wrong. You just lost an internet argument, my friend.

x[y] is equivalent to *((x) + (y))
it's all pointers nigger