Does anyone here actually know C

Does anyone here actually know C

a b c d e f g h i j k l m n o p q r s t u v w x y z

me

[spoiler]
int
float
char
array of char
[/spoiler]

WRONG

What's wrong? I noticed at the last moment that the fourth should be array of const char, is there anything else wrong though?

int and float are correct, others are wrong

I know C but haven't seen him in a while. Can I take a message?

number, number, string, string
c is outdated trash

Int float char char

But it's right, you idiot. And yes, I know C. However, I generally use stdint where I can for the sake of clarity.

int, float, int, char[2]

int
float
int
const char[2]

depends on what you mean by know

I've written a lot of C, and I understand how to use it in practice

That doesn't necessarily mean I know every detail of the language by heart, and for any sufficiently tricky question I'd almost surely have to open up the spec

String literals have type char[], not const char[], but modifying the contents is UB.
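
Minimal sketch of that point if anyone wants to try it (compiled as C, names are mine):

int main(void) {
    char *p = "hi";    /* legal in C: the literal has type char[3], no const */
    /* p[0] = 'H';        compiles, but modifying the literal is undefined behavior */

    char buf[] = "hi"; /* this copies the literal into a writable array */
    buf[0] = 'H';      /* fine */
    return 0;
}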

Oh also, my answer to the pic:

>int
>float
>char
>const char *

int, float, int (char in C++) and char[2]

Why is '9' an int? I know int and char can be treated the same way and all of that, but aren't the single quotation marks meant for the char type?

tfw intentionally unlearned it.

Because it's c and the language is retarded.
character literals are int.
They are char in C++.

Don't know the reasoning behind it but in C a character literal is an int. Try sizeof('a'). In C++ it's a char.
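
Quick sketch if anyone wants to check (the printed values assume a typical platform with a 4-byte int):

#include <stdio.h>

int main(void) {
    printf("%zu\n", sizeof 'a');  /* typically 4 when compiled as C, 1 as C++ */

    char c = 'a';                 /* stored in a char it still takes one byte; */
    printf("%zu\n", sizeof c);    /* only the literal expression has type int */
    return 0;
}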

const int
const float
const char
char* pointing to rodata

Int, float, int, void*

BAIT

>I know C
>However
found the webshit

>and the language is retarded
Except it's supposed to be low level and to a computer it's all the same shit. That's how it's supposed to be.

You're retarded.

mind=blown

So the correct answer is:

>int
>float
>int
>char *

?

>hurr durr low level durr hurr hurr
and that makes it impossible to make character literals be ‘char’ instead of ‘int’, gotcha

>depends on what you mean by know
so you don't
which proves that you don't

char *
char[3]. Try sizeof.

nope

char[2]

But it still doesn't make sense: no sane person programming in machine code would use 4 bytes to store an ASCII character, so why does a C compiler do so?

>C
>types
Just bytes, all of these.

>an ASCII character
you're an idiot

>so you don't
according to your definition of “know”, I don't

>I don't
it shows

Ah, okay. I tried figuring it out by doing

foo.c: In function ‘main’:
foo.c:4:17: error: incompatible types when initializing type ‘float’ using type ‘char *’
float foo = "bar";


But I guess the [4] was already dropped at that point, thus making it only ‘char *’ in the error message?

It's ironic how a language with such a wildly confusing type system is the one with the fewest mechanisms for letting users figure out the type of some expression.

Question: Do you believe your definitions of words are objective and universal?

The default chars in C are ASCII. Unicode is wchar_t. And a char is 8 bits on many platforms; that's not enough to store non-ASCII characters.

>wildly confusing type system
you're literally retarded

>But I guess the [4] was already dropped at that point, thus making it only ‘char *’ in the error message?
Seems that way. I guess that at least for type checking, the compiler doesn't need to care about the differences between arrays and pointers.
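
For what it's worth, C11's _Generic can be (ab)used to label an expression's type. A rough sketch with a macro name and case list I made up; note that, at least with gcc and clang, the controlling expression undergoes array-to-pointer decay, which matches what the error message shows:

#include <stdio.h>

#define TYPENAME(x) _Generic((x), \
    int:     "int",               \
    float:   "float",             \
    char:    "char",              \
    char *:  "char *",            \
    default: "something else")

int main(void) {
    puts(TYPENAME(9));     /* int */
    puts(TYPENAME(9.0f));  /* float */
    puts(TYPENAME('9'));   /* int: character literals have type int in C */
    puts(TYPENAME("9"));   /* char *: the char[2] decays before _Generic matches */
    return 0;
}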

depends on the definition of "believe", "definition", "words", "objective" and "universe"

Bro do you even? Name a type system simpler than C.

int, float, char, const char*

>The default chars in C are ASCII
wrong
>Unicode is wchar_t
wrong again

Typeless.

>I'm just another retard

>he thinks that's simpler
Oh dear

Says the guy posting in a thread proving that apparently, nobody understands C's type system.

Ask the same about Haskell, for example, and the answers will all be obvious and clear, which partly has to do with the fact that the Haskell type system is sane and consistent, and partly has to do with the fact that GHC will let you query the type of any expression.

Haskell. See

Simple != easy.

My point exactly.

>FP
>simple
>mfw everyone here is Wesley Crusher

> int, float, int, char[2]

Easy, next question?

int, float, char, const char*. It's not an array type since I can assign strings of other length later.

Then did you confuse typeless with dynamic typing or something? One type is always simpler than a system of types.

>It's not an array type since I can assign strings of other length later

Technically they're all const. They're literals.

Dynamic typing is also an example of a one-type system (at least as far as the static semantics are concerned)

Try to implement a typeless language and then tell me about it

{'0',0} is char[2], "0" is not.

Brainfuck.

>nobody understands
nah, only braindead shits like you have a difficulty understanding C's type system

"0\n" is

The one in Commodore BASIC, brainfuck, or asm.

Dynamic typing has types and they can be checked against. It's not typeless.

>char[2], "0" is not
sure it is
no, that's char[3]

Sure is simple

Yes and I use it for embedded systems.

...

I don't know how to write code. Well, maybe in bash/zsh or some kiddie shit like that.

>no, that's char[3]
Eh, I meant "0\0"
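
Sketch of the sizeof check, nothing non-standard assumed:

#include <stdio.h>

int main(void) {
    char arr[] = {'0', 0};
    printf("%zu\n", sizeof arr);    /* 2 */
    printf("%zu\n", sizeof "0");    /* 2: char[2], the '0' plus the terminating '\0' */
    printf("%zu\n", sizeof "0\n");  /* 3: char[3] */
    return 0;
}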

int
float
int
const char *

>asking someone to define a datatype by the thing surrounding the data itself

this is C, not Ruby

int
float (though it wouldn't work without knowing what "f" is)
char
char

don't even pull the "array" bullshit, there's only one element in all of those so there's literally zero point in declaring an array

>sure it is
Nah. String literals of different lengths are equivalent, arrays of different lengths aren't.
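
That reassignment works because the literal (an array) decays to a pointer. A rough sketch; the sizes assume a typical 64-bit platform and the variable name is mine:

#include <stdio.h>

int main(void) {
    const char *s = "one";           /* "one" has type char[4]; it decays to a pointer here */
    printf("%zu\n", sizeof "one");   /* 4: the array type of the literal itself */
    printf("%zu\n", sizeof s);       /* 8 on a typical 64-bit platform: s is just a pointer */

    s = "a longer string";           /* fine: only the pointer is reassigned */
    puts(s);
    return 0;
}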

Wtf

At the end of the day, they're all just binary values ranging from 8-64 bits.

HAVE YOU FAGGOTS EVER COMPLETED A SINGLE C PROJECT THAT'S NOT A FIZZ BUZZ, DO YOU HAVE A JOB OR YOU JUST LIKE TO FUCKING COMPLAIN ABOUT IRRELEVANT FUCKING THINGS.

YEAH, I THOUGHT SO... FUCKING KILL YOURSELVES FAT NEETS.

Yeah, I also did Fibonacci... non-recursively!

what part don't you understand?
still char[3]
wrong
wrong
you're literally mentally ill
false

OOOooo....ooh

>YOU JUST LIKE TO FUCKING COMPLAIN ABOUT IRRELEVANT FUCKING THINGS.
Welcome to the internet, friend.

#Define c_is_for_cookie_and_that_is_good_enough_for_me cookie_monster;

const unsigned long long
const unsigned long float
const unsigned char
const unsigned char*

>upper case define
>semicolon after a define

>#Define c_is_for_cookie_and_that_is_good_enough_for_me cookie_monster;
and even for such a simple thing, you managed to fuck it up, twice

>false

Ints and floats are generally 32 bits, char is almost universally 8 bits, and a pointer is 32 or 64 bits on modern CPUs.

>tfw big dick

Geez, so harsh. I only started learning it like 3 hours ago.

>int literal
>float literal
>char literal
>c-style string literal

>swans are generally white
ok

I know that feel.

>not types

>a retarded analogy
not him, but you are a tard, go away

There was some HP Calculator architecture with a 16 bit char, IIRC.

long double was 80 bits on some x86 compilers, before everyone abandoned the 387 instruction set. A lot of compilers also have the __m128 and __m256 types now.
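
Quick sketch to print the numbers for your own machine (the results are exactly the part that varies by platform):

#include <stdio.h>
#include <limits.h>

int main(void) {
    printf("CHAR_BIT            = %d\n", CHAR_BIT);
    printf("sizeof(int)         = %zu\n", sizeof(int));
    printf("sizeof(float)       = %zu\n", sizeof(float));
    printf("sizeof(long double) = %zu\n", sizeof(long double));
    printf("sizeof(void *)      = %zu\n", sizeof(void *));
    return 0;
}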

Well, “braindead” implies some definition of intelligence, and for the concept of intelligence it comes down to what kind of skills you're better at. C's type system as a whole is pretty inconsistent and arbitrary, and understanding it mostly comes down to memorizing the specification. Haskell's type system as a whole is consistent and generated by a very simple set of rules, so understanding it mostly comes down to logical thinking.

Some people are better at logical thinking and bad at arbitrary memorization. These people tend to be good at maths and bad at foreign languages, for example.

Other people are better at arbitrary memorization and worse at logical thinking. I know from experience that I tend to fall into the former category, so it doesn't surprise me that I can't intuitively understand C's type system without having memorized the specification, while something like Haskell or Agda comes as second nature to me.

But he's right. Assuming that ints are 32 bits is idiotic.

I think there's maybe 3 actual programmers ITT, 2 of whom have given up by now.

why do hasklelfags have to shit up every thread

Here's a radical idea. Maybe you learn the language when you actually use it?