Learning C in >current year

Is C still worth learning nowadays?

I'm studying CS, but my uni's program only teaches Python, Java and C++, so I figured I'd try to start learning C on my own over the winter break.

I already have a decent grasp on Python and Java, and I've always heard that all the real enlightened big brain neProidians learn C to understand how a computer works better, since it's "close to the metal" unlike Python and Java. But even starting out with Learn C the Hard Way, Zed Shaw says that C is a fundamentally flawed language with many errors that later languages fixed into non-issues.

Should I keep going?

Other urls found in this thread:

publications.gbdirect.co.uk/c_book/
github.com/nothings/stb/blob/master/stretchy_buffer.h
github.com/nothings/stb/blob/master/stb.h#L2981
warp.povusers.org/grrr/HateC.html
cs.cmu.edu/~15122/schedule.shtml
cs.cmu.edu/~213/schedule.html
matt.sh/howto-c
cert.org/secure-coding/publications/books/cert-c-secure-coding-standard.cfm?

it is a "flawed language" in the sense that its functions does not do boundary checking, instead it puts that burden on you, and that is pretty good when memory is scarce. you also find yourself implementing algorithms that are given to you in the form of a function, as is the case in C++, it is fun, not sure when time constraints are set though.

>C is a fundamentally flawed language

Such as? Give examples

Start by learning some ASM, on whatever architecture. The beauty of C comes with the knowledge of how it is implemented.

Like I said, I'm just learning and quoting what Zed Shaw says in the intro to the book.

>learns basics of le brainlet Uber for Houseplants language and pajeet verbose PooInLoo shitlang
>thinks he's ready to take on C

oh sweetie

AHAHHAHAHA OH NONONO AHAHAHAHA

>all the real enlightened big brain neProidians learn C to understand how a computer works

Objectively false. If you don't fuck with Assembly you don't know shit about how a computer works.

But he's an idiot. A miserable failure.

>not learning HolyC
I bet OP glows in the dark

you should be applying for internships, you retard. enjoy unemployment

Do you want to write libraries or software for embedded devices? If not, there isn't much practical use for C for you. It's a very comfy language though and it isn't as complicated as people make it out to be.

If you want to learn it, just start, man. Nobody is going to learn it for you.

pros of C:
>if you write a C library, it's incredibly usable
>use just about every library
>fast as fuck, with a few exceptions (char*)
>rough understanding of what's going on behind the scenes in higher level langs (learn ASM lol)
>painful simplicity

cons of C:
>doesn't actually do anything for you
>unsafe
>very little inference
>reinvent every wheel
>people forgot how to write C, and use dumb shit like include guards, macros for unneeded "generic" programming, fake OOP, and more

>Zed Shaw
Don't read this idiot's books. Yes, C is considered flawed relative to other systems languages, but it is sort of a lingua franca in this area. That said, Zed Shaw cannot into C.

The best books for C are Kernighan and Ritchie's The C Programming Language, and C Primer Plus. The former is literally by the original designers of C, the latter better covers the latest standardization of C. I recommend the latter, but most of Sup Forums has a hard on for the former. Don't read Zed Shaw.

>Dumb shit like include guards
Include guards are the portable way to do this in standard C. #pragma once, while supported by most compilers, is non-standard: some future C compiler could be written that doesn't support it, yet compiles code better than GCC and Clang and becomes the new de-facto compiler. Nearly every software library written in C uses include guards, because this is how you are expected to do things in C. Honestly, the worst people are the ones who don't use include guards at all, or don't know how a linker works and #include their .c files.
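
For reference, the pattern in question looks like this (foo.h and foo_do_thing are made-up names):

/* foo.h */
#ifndef FOO_H
#define FOO_H

int foo_do_thing(int x);

#endif /* FOO_H */

/* the non-standard alternative most compilers accept: */
/* #pragma once */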

I would definitely recommend learning C if you want to be a serious programmer. C is considered the lingua franca of programming languages; pretty much any language has some way to interface with it. You'll learn more about what's going on behind the scenes than with anything short of actual assembly (I'd also recommend learning to read some simple asm if you really want to know). It's also a very simple language. The book "The C Programming Language" only takes 272 pages to describe the entire language. Even if you never end up using it, it doesn't take too long to pick up and gives you a lot of information. If you ever get into scientific computing you'll probably write prototypes in Python or something similar and then implement the actual code in C or FORTRAN to make it fast.

Do you think OP is incapable of doing both?

Yes, you should. C is still around 40+ years later for a good reason. Any flaws in the language are blemishes or consequences of using the wrong tool for the job - C fills its role admirably well.

include guards are one of the most misunderstood and misused things among nu-C programmers. Within a library or program, having headers include other headers is slower and obfuscates the resulting program, making it harder to tell where things came from and to understand what needs what in order to function.
The best practice is to simply make note of dependencies at the top of each header, including things accordingly within your program, and to use include guards only on user-facing headers.

Yeah, that shit doesn't scale in large programs, and the decrease in compilation speed isn't significant enough to justify its use. Linux uses include guards everywhere, and so should you.

>macros for unneeded "generic" programming
what do you mean by this? is it "Template C" or something else? if it's template C, explain why you shouldn't use it, aside from the fact that it's a bit harder to get started with.
for something as simple as sorting it's alright to use void*, but for something a bit more complicated, like data structures (hash table, vector, whatever), it's much better to create them in template C than to reimplement them by hand each time you need to contain a different type.
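
For anyone who hasn't seen the term, "template C" here means stamping out type-specific code with the preprocessor; a rough sketch (all names invented):

#include <stdlib.h>

/* one macro generates a typed, growable array per element type */
#define DEFINE_VEC(T, Name)                                         \
    typedef struct { T *data; size_t len, cap; } Name;              \
    static int Name##_push(Name *v, T value)                        \
    {                                                               \
        if (v->len == v->cap) {                                     \
            size_t ncap = v->cap ? v->cap * 2 : 8;                  \
            T *p = realloc(v->data, ncap * sizeof *p);              \
            if (!p) return -1;                                      \
            v->data = p;                                            \
            v->cap = ncap;                                          \
        }                                                           \
        v->data[v->len++] = value;                                  \
        return 0;                                                   \
    }

DEFINE_VEC(int, IntVec)        /* gives you IntVec and IntVec_push */
DEFINE_VEC(double, DoubleVec)  /* fully type-checked, no void*     */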

still knows more about C than most of the C memers around here

Any opinions on this one? I have only read the part on pointers of this book (and got it in the first read), but I haven't read the rest. publications.gbdirect.co.uk/c_book/

read Modern C by Jens Gustedt
~400 pages on the latest standard (C11), not C89/ANSI C, nor C99
if you just want to get started, like in a day or two, then K&R is a good choice, but it has problems, like oversimplifying and overlooking things (including some magic sometimes)
after reading K&R and/or reading ~half of Modern C, try reading Object Oriented C (don't remember the author) for a different look on things, making polymorphic functions, how to make "magic" work your way, and a few tricks
i say after because there are some bad habits you may catch from it, so don't take it TOO seriously; it's not that great a book and it's a bit old as well
other than reading you should practice: think of something useful you could make, start with something simple, and build it up as you learn more
an important note: the C language itself is simple and easy to learn, but hard to use in some bigger projects (because of reinventing the wheel, like some user above said). above all, if something doesn't work, there's a reason for it, and that's an opportunity to learn more about the computer and/or your OS

forgot to say ##c on freenode is a good place to ask questions and for resources

don't do what these fucks tell you to do

c is no longer relevant for 99.99% of jobs

if you want to learn c to fuck around on your free time, go ahead, but for work if you don't want to become a web monkey you'll end up working either with java or c#

If you read the OP at all, you would see that he is learning it in his free time.

yes, but he doesn't specify if he wants to learn just to know it or if he's considering it for a production environment

>if you don't want to become a web monkey
>Java or c#
Those are webmonkey languages.

Here's some that I use in industry that are not:
Ladder logic
Function diagram
State diagrams
All sorts of pseudo c for various devices
Vb
VB script
Mssql
Powershell
Oracle
C++
Classic ansi c
Python

We also use Java and c# but mostly for the business dicks who want kpis. All of our reporting interfaces are Java, HTML and c# depending on the platform.

Try again, little troll.

>Is C still worth learning?
Yes, but that doesn't mean it's usually worth using

>doesn't scale in large programs
>but include guards increase compilation time exponentially
pick one. I'd rather have a few more includes on every file. It's what NASA did, after all.

Sure, templates are useful, but C isn't about generic programming. It's about deeply specialized programming, where you write what you need to get the job done.

For example, let's say you read a list of words from a file. Most people would make a vector of strings, then push the words to it. In C, you could do that, but the specialized solution is just to read the file once, allocate, then read it again and copy in the strings.

The result is not necessarily faster, but it's a simpler set of tasks for the machine, and a transparent set of operations all the way down to the metal.
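
Roughly, that flavour of approach might look like this (a loose sketch: the filename is made up and error handling is minimal). Slurp the whole file into one buffer, then point at the words in place instead of copying each one into its own little allocation:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    FILE *f = fopen("words.txt", "rb");
    if (!f) return 1;

    fseek(f, 0, SEEK_END);            /* first pass over the file: how big is it? */
    long len = ftell(f);
    rewind(f);

    char *buf = malloc(len + 1);      /* one allocation for everything            */
    fread(buf, 1, len, f);
    buf[len] = '\0';
    fclose(f);

    /* second pass: carve the buffer into words in place */
    for (char *w = strtok(buf, " \t\r\n"); w; w = strtok(NULL, " \t\r\n"))
        puts(w);

    free(buf);
    return 0;
}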

THIS.
I'm tired of monkeys coding in C. Do as everyone else and pick js ffs

a "specialized" version reading a file twice seems dumb to say the least (no offense). reading from disk is by far the worse/slowest part in the process you described
and talking about specializing, with templates you can get specialized code for whatever type you need! but the CPP does it in your stead, no need for manual work
on top of that, you get type checking (as best as C can do), instead of a generic void*, which you have no way of knowing what the original type is
ive seen some implementations of generic auto-growing arrays/vectors, and they all rely on the programmer to know stuff, which isnt bad per se, and sometimes it might even be a better option, but i dont like it, so i made my own
my favorite generic vector implementations are the STB ones (github.com/nothings/stb/blob/master/stretchy_buffer.h and github.com/nothings/stb/blob/master/stb.h#L2981) i think theyre really sexy

hi

aspnet core says hi

Rewrite yourself in Rust.

first: I'm not advocating genericism. I don't care about void* versus templates.

second: C isn't about speed. It's about transparency. By iterating twice, it becomes easy to reason about and work with the involved memory. C teaches you to see memory as memory, which is the root of performant code, even if it's not always an ideal solution.

>c is high performance because it makes memory access obvious

Mochi mochi, this is me, pointer aliasing.

Writing fast C code involves more of compiler theory than most people would like to admit.

Notice how I said the speed benefit of C is a change in perspective about memory, and not the fact that C itself is inherently fast.

warp.povusers.org/grrr/HateC.html

no

His opinions are valid. C does not help you write programs. C is not generic. In C, you don't get to work with the comfortable set of abstractions provided by vectors and hashmaps.
However, being able to understand the C way of doing things is a benefit to any programmer.
Sure, you can increase the size of your binary with a hash map. Or, you can just do a linear search, and be surprised when it's far faster until your scale increases.
Sure, you can write a concise solution to a problem in only a few lines of Python, but that's not what C is for.

So we can agree that C should stay in the classrooms.

Only worth learning if you plan on being an embedded software or OS developer.

I'm going to assume you're talking about avoiding spooky action at a distance, which is of course a nice thing to have. Go has it too, so why not use Go if you want strongly local semantics.

The trouble with C is that its memory model is probably the worst example in there when it comes to transparent locality. For example, aliasing:

int x(int *a, char *b) { *a = 1; *b = 2; return *a; }
int y(int *a, long *b) { *a = 1; *b = 2; return *a; }

With strict aliasing these two may behave differently even when a and b point at the same location: in x the char * store is allowed to alias *a, so the compiler has to reload it, while in y it may assume an int * and a long * never overlap and simply return 1 (and pointing them at the same object is not actually defined behaviour anyway).

The language is plagued with shit like this, another one is alignment wrt unions and bit width integers.

Whether you like it or not C isn't ever leaving for good. If we can't even get rid of COBOL we're sure as hell not getting rid of C.

>mochi mochi
you mean もしもし(moshimoshi), もちもち(mochimochi) is smth like もち(mochi) (the food) in texture

what do you mean by compiler theory?

I know firsthand that classrooms easily fail at giving you any relevant C experience.
I like to use C sometimes because it keeps me in a certain zen-like mindset. I don't write fast C, I don't write complex or generic or OOP C. I just think about memory and write things that are simple for the system.
And of course, the topic is whether learning C is worth it, which I'd argue is true.

But yeah, I think that C is not an ideal language in practice. Rust is a solid choice. C++ can suffice with the right mindset.

>bit width integers
i assume you're talking about bit fields? i had never thought about this before: what happens if the value you assign to a bit field is outside the range that bit field can represent?
struct {unsigned char x : 1;} x = { .x = 2 };

what's the value of x.x?

Not undefined, actually: storing an out-of-range value in an unsigned bit-field is well-defined, the value is reduced modulo 2^width. Only the bottom bit of 2 survives, and that bit is 0, so x.x is 0.

good point
what about .x = -2 ? i think it makes more sense to start from the right (LSB), but on the other hand, struct fields are supposed to be placed in memory in the same order they're declared, so from that point of view, starting on the left (MSB) makes more sense
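
A quick experiment suggests it isn't really about picking bits from either end: for an unsigned bit-field the stored value is just reduced modulo 2^width (made-up struct below, assuming a compiler that accepts the out-of-range constants with at most a warning):

#include <stdio.h>

struct bits { unsigned int x : 3; };

int main(void)
{
    struct bits a = { .x = 10 };   /* 10 mod 8 == 2                     */
    struct bits b = { .x = -2 };   /* -2 wraps to 6 in a 3-bit unsigned */
    printf("%u %u\n", (unsigned)a.x, (unsigned)b.x);   /* prints: 2 6   */
    return 0;
}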

strcpy
buffer overflow
segmentation fault
strings are literally arrays of chars

should i go on?

>Strings are literally arrays of characters
But, strings ARE literally just arrays of characters

I Know you get my point nigger, stop dodgin'

correction, in C, strings are arrays of chars (usually bytes), not characters
there's no native support for Unicode, for example

C is still extensively used in real-time and embedded applications

Strings are linked lists of characters

A, never listen to Zed Shaw. He's a tool, with basically the technical prowess of Steve Ballmer, the tact of Theo de Raadt, and all the charm and humility of Lennart Poettering.

Basically, a loud mouthed troll who's full of himself.

B, you don't need to program in it regularly, but you should learn it. A whole lot of the world we interact with as programmers is heavily based on it.

hello haskeller, how have you been?

The problem isn't that they're arrays of characters, it's that they're null-terminated. std::string is a fat pointer with SSO, which is safer, often far faster, and nicer to use. That being said, a string of known size is often comparable.
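
The "fat pointer" idea is easy enough to fake in C, for what it's worth; a hypothetical sketch (str_view, SV and sv_eq are invented names):

#include <stdio.h>
#include <string.h>

typedef struct {
    const char *data;   /* not necessarily NUL-terminated  */
    size_t      len;    /* length travels with the pointer */
} str_view;

#define SV(lit) ((str_view){ (lit), sizeof(lit) - 1 })

static int sv_eq(str_view a, str_view b)
{
    return a.len == b.len && memcmp(a.data, b.data, a.len) == 0;
}

int main(void)
{
    str_view s = SV("hello");
    printf("%.*s has length %zu\n", (int)s.len, s.data, s.len);
    printf("equal: %d\n", sv_eq(s, SV("hello")));
    return 0;
}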

Well, plan9 C uses utf-8. Also, standard lib things like wchar.h give you...usable unicode facilities. I'd say a bigger problem is locales.

>However, being able to understand the C way of doing things is a benefit to any programmer.

Should I study C if I want to understand how computer programs work on a general level? If not then what should I focus on? Asking from security perspective and yes I'm a total noob, pls no bully guise

Pretty well. Learning Rust and it's very comfy.

I use C because I work in low latency telecoms software. It's definitely still got its place in certain industries.

C is like the mid-level way that things work, and can lend to understanding lower-level concepts. That being said, assembly is more revealing and could be a better place to go. Many people will tell you that learning x86 is a worthwhile first step to understanding security.

what i meant by no native Unicode support is that the standard lib doesn't provide ways to iterate over Unicode strings (no matter the encoding: UTF-8, 16 or 32)
if available, they are usually part of the OS libraries (at least on Windows, don't actually know how it is on Linux/*BSD) or external libraries
didn't know that about plan9, but isn't it just plain dead?
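
To be fair to C, skimming code points out of UTF-8 doesn't take much code; a rough sketch (assumes the input is already valid UTF-8, no error handling):

#include <stdio.h>
#include <string.h>

/* count code points by skipping 10xxxxxx continuation bytes */
static size_t utf8_count(const char *s)
{
    size_t n = 0;
    for (; *s; s++)
        if (((unsigned char)*s & 0xC0) != 0x80)
            n++;
    return n;
}

int main(void)
{
    const char *word = "na\xC3\xAFve";   /* "naïve" spelled out as UTF-8 bytes */
    printf("%zu bytes, %zu code points\n", strlen(word), utf8_count(word));   /* 6, 5 */
    return 0;
}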

i'd say yes, because of its low-level nature, but if you really wanna dig deep, you have to go deeper and learn ASM

>Rust is very comfy
agreed, although if you're a veteran Haskeller you'll still miss some things

>increase compilation time exponentially
>exponentially
Bullshit. If that were the case, Linux would take years to compile.

Not being able to compose functions is my biggest pain.
>muh 100% point free programs

C is the only way, you will C.

Bit fields in and of themselves are fine - bit-field integer representation is arch-specific, same as endianness and everything else pertaining to the memory model. We're not talking ANSI C, but "high level asm" C - it is presumed you already know the model when you're dealing with its specifics.

The issue with bit fields is again aliasing. In C, unions are a safe way of type punning a memory representation, with the exception of bit fields. Some compilers type pun only the bit width, some will pun the whole word the bit field is aligned to; it's a mess, and such code often needs to be compiled with a weak-aliasing flag (= slower code).
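
For reference, the non-bit-field case that IS considered safe looks like this (a small sketch; assumes 32-bit IEEE-754 floats, which is near-universal):

#include <stdio.h>
#include <inttypes.h>

union pun {
    float    f;
    uint32_t u;
};

int main(void)
{
    union pun p;
    p.f = 1.0f;
    printf("bits of 1.0f: 0x%08" PRIX32 "\n", p.u);   /* 0x3F800000 under IEEE-754 */
    return 0;
}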

You need to ask yourself and/or learn what you find interesting to program.

C is not great for doing stuff like string manipulation. Sure, you could write some very fast code, but in my experience my programs are usually waiting on disk I/O or the network rather than handling strings, and it's usually much easier to handle in Python, where the speed largely doesn't matter.

That said, if you want to do stuff like program microcontrollers or embedded systems that fly, drive, or beep about, you'll probably need to learn C and possibly assembly. I do embedded kernel programming all day. I rarely ever manipulate a string and I couldn't care less about bounds checking. Bounds checking won't do you much good when you're booting the kernel anyway. Being able to very selectively control where memory comes from is required.

The one problem I think people run into with C is that they don't really have the time to deeply understand it. There's not really that much syntax, or that many things you can do, compared to stuff like Python or Perl. In my experience, to get the most from C you have to become an expert in the low-level details of your platform.

Hence, you may not put the time into C to get the most out of it, and depending what you're doing there's a good chance it may not be the best tool for the job.

Regardless, I still like C the best of any language. The simplicity of its syntax is wonderful and at least you can always objdump your code to figure out what the hell is going on.

ikr procedure-lets swear by thisLastThing(anotherThing(thing(my_thing)))
and OOPlets by my_thing.thing().anotherThing().thisLastThing()
but nothing beats doManyThings = thisLastThing . anotherThing . thing

====================================================================

Step 1: Read K&R; realize the style of declarations used in it isn't used today, but it's still the definitive resource on how to read a declaration, how brackets order operations, etc.

Step 2: Read these lecture notes, slides/additional resources cs.cmu.edu/~15122/schedule.shtml

Step 3: Watch these lectures, read the course book (click on 'old lectures') cs.cmu.edu/~213/schedule.html

Step 4: Read this modern C guide matt.sh/howto-c

Final step: Read this and use for reference cert.org/secure-coding/publications/books/cert-c-secure-coding-standard.cfm?

That's all you need. K&R will teach you what undefined behavior is. 15-122 will teach you to write safe programs and how to analyze existing programs. 15-213 will teach you what C looks like at the assembly level, stack frames, two's complement representation, floating point, etc. That 'How to C in 2016' guide will teach you how modern C is written to avoid classic C problems like throwing around chars and ints. The CERT guide is a good desktop book to have around to make sure the shit you are writing cannot under any circumstances lead to undefined behavior. Whatever project you decide to contribute to after doing all this to obtain base competence in modern C programming, it will have a contributor's style guide which you must read, like the kernel.org style guide or OpenBSD's style man page.

====================================================================

At least there is no bullshit on the level of Car car = new Car();
One of the few things I honestly prefer in Rust is its snake_case convention, way more readable than camelCase imho.

>matt.sh/howto-c
>shilling for void func() {

Not putting the function's opening brace on a new line is for barbarians and infidels. Rationale:

>Heretic people all over the world have claimed that this inconsistency is ... well ... inconsistent, but all right-thinking people know that (a) K&R are right and (b) K&R are right

(there's a technical reason for this - so you can find function labels with a grep/vi regex).

you don't have to learn C because it's good (it's not), you have to learn it so you learn how the machine works. for example, if you ask someone who only ever learned Java why a successful run of his program returns 0 even though the main function's return type is void, he wouldn't be able to answer. or if you ask what Double.isNaN, positive infinity and negative infinity are, he probably wouldn't know the answer. everything is built on C and C++, and not knowing them makes you a shittier programmer, no matter which language you use.
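
The C side of that exit-status question fits in a couple of lines, if you want to poke at it (check the result with `echo $?` in a shell afterwards):

#include <stdlib.h>

int main(void)
{
    return 42;      /* or exit(42); the shell sees 42 as the exit status */
}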

>and not knowing them makes you a shittier programmer
desu this is a common argument among lispfags "you will never use it, but knowing lisp will make you a better programmer".
what the fuck user.

I find Modern C incredibly dry for beginners.

I'm glad this pasta is still being passed around. Seems the most reasonable way to tackle this language as of today.

yeah i also prefer snake_case, but a nice thing is that in Rust they use all of them for different purposes, and, even though it's not mandatory, it's advised to follow these conventions.
SCREAM_CASE for global constants, PascalCase for types and traits, snake_case for functions and variables
i really think it's a nice language, too bad it's been hit by the meme bat and isn't taken as seriously as it deserves
>sepples is all you need, Rust is for brainlets

same thing with Haskell, you learn a new way of thinking. you'll never be the same after learning Lisp and/or Haskell! even though i write most of my projects in C, after learning Haskell everything became so much clearer and easier to think about! i also usually "implement" a prototype program in Haskell to check the types match and i didn't completely fuck up somewhere
still in the process of learning Lisp though. i've wanted to learn it for a long time
also the "you will never use it" is ofc not true. there are many jobs for Lisp/Haskell programmers and there are many ways to make both interact with other languages, especially Lisp with C (see Chicken Scheme)

somewhat agree, but it's straight to the point, no soothing the experience. if you keep up till the end, it only shows you're serious. that's why i like it

I found a lot (and I mean A LOT) of people I knew from Haskell communities learning Rust, which goes to show that people who do not fall for the meme and/or are not aware of it know that it is the future of programming. Not to mention that having a Haskell background makes you understand type theory shit without second thoughts.

>C is a fundamentally flawed language
No, C is a language that doesn't hold your hand. Just because it doesn't spit errors at you because you're too stupid to bounds check or manage your memory or avoid undefined behavior doesn't make it the language's fault. C is a pure language, perfect in its simplicity and small core. Calling it flawed is like complaining that you sliced your fingers off using a mandolin. It's not the fucking mandolin's fault.

>my_thing
>snake case
Triggered

It's only a flawed language for brainlets

it's a bit like learning Latin nowadays
it's not used in itself as much as it used to be (even pure C things these days tend to use a fuckton of libraries), but it's important to linguistic history, and learning it will give you insight into several other languages

I think calling it "fundamentally" flawed is definitely a stretch, but I can see why some people just don't like it.

Even then, most issues I've had with it are easily solvable with a debugger or a Google search

>buffer overflow
>segmentation fault
hard to avoid in a language that's so close to the metal, which isn't a failing so much as it's a design choice
>strings are literally arrays of chars
it's annoying but it's not a big deal, and the most recent standard does have a standard String data structure
>strcpy
I'll give you that one, I don't remember the details but holy shit, I know it by reputation, and it's got a reputation up there with "eval()" as far as ways to fuck things up

shouldn't it be result_of_many_things = thing.another_thing.this_last_thing, replacing any of those with thing() if need be?

no step on snek case

C is pretty much exclusively used in my field of work. Embedded systems programming/reverse engineering.

in the near future, sure, but in the not so near future, i think something like Lisp/Haskell should be the norm, looks-wise

>most recent standard does have a standard String data structure
what? what language are you talking about?

no, it's just like function composition in math:
(f . g)(x) = f(g(x))

so
thisLastThing(anotherThing(thing(my_thing))) = (thisLastThing . anotherThing . thing)(my_thing)

hence my def of doManyThings:
-- because `my_thing` is on both sides of the = it can be omitted
-- and both of these lines mean/do the same
doManyThings my_thing = (thisLastThing . anotherThing . thing) my_thing
doManyThings = thisLastThing . anotherThing . thing
-- the 1st one is called "point wise" and the 2nd "point free"
-- because there are no "variables", only function composition

this is valid Haskell code (the one up there is too, but only for unary functions, because Haskell doesn't need parens everywhere)
i guess the confusion comes from natural language: we read from left to right, but functions are applied right to left
that is the biggest reason people like OOP method notation: they can read the process from left to right, just like they would in many natural languages

>fundamentally flawed language and has many errors
Okay, what y'all youngins need to understand is what programming looked like before C, and what it looked like with C.
Look at pic related, look at this beautiful thing. This is a Data General Nova, first released in 1969, and my grandfather had his employers buy him one, because they were sick of him hogging the mainframe for weeks on end.
I think at first they got him a PDP-8, but he got sick of that thing QUICK, he needed a REAL machine.
He wasn't gonna settle for a Nova 1200, nuh, he needed the fastest, so he got a Nova 800.
800ns cycle time, 300kHz CPU, 16k of memory (8k words), and by the heavens, a 1.4MB hard disk.
Now, in 1972, what the hell are you gonna do with a 1.4MB hard disk? This wasn't a mainframe, this was a personal minicomputer!
Well, first he'd sit down and read through hundreds of pages of technical documents to understand the architecture, then he'd play around keying stuff into the front panel, then he'd hook up a punch card reader and start writing programs.


He would program everything in machine code, he wrote his own disk drivers, task scheduler, everything. He'd keep hand drawn spreadsheets to track where he'd put stuff in memory, so he could remember for later if something needed to be accessed. HE WAS MANUALLY PUNCHING IN MEMORY LOCATIONS FROM A HAND DRAWN CHART.
What he did with it, well, he spent months running simulations to work out why NTSC was shifting colours when they broadcast it over around 800 km, tasked with coming up with a solution, coining the phrase "Never The Same colour Twice", and saying "Know what? That PAL thing looks nice, let's just use that instead".
Cont

Cont.
He decided that maybe computers could be used to sharpen film pictures, but first he needed a digital copy of an image. So he hooked up a TV camera, had the "nerds" make an interface box for him so he could plug it into his Nova, then wrote all the software needed to read from an analog TV camera, write to disk, using a 300kHz processor in 16k of memory. One scan, producing an image with about a million points (1 megapixel), IN 1972, took about 12 hours to run, so he'd focus the camera on a picture, start the scan, go home, and when he got back to work the next day, it'd be done.
This was using stuff he'd just found lying around his lab.

But that's not the crazy part, he wasn't trying to make a scanner or camera, he'd already seen NASA do that over in the US years before, he wanted to make an image sharpener.
So, in pure machine code, by hand, with no memory management provided whatsoever, he started writing a program, by means of punching holes in cards, that would traverse the entire image, section by section, recognising patterns, and modifying the image appropriately. To run this on a single color channel of an image took 1 week, and you'd need to do all three channels separately. Surprise surprise, he mostly just did it all in grayscale.
It worked, it worked quite well, and you damn well bet K*dak patented the heck out of it, but dedicating a $8,000 minicomputer ($47,000 adjusted), FOR THREE WEEKS, just to sharpen an image, in 1972 just wasn't viable.
Plus there was no way of putting the image back on film. It could be displayed on a persistent CRT, well, an old oscilloscope Gramps found (a brand new oscilloscope gramps pinched from the guys next door), but you'd have to zoom in on a section to see the quality benefit, and it was still all green.
Cont

Damn, grandpappy sounds like a real big brain nibba.

Cont

Now, this is the sorta thing people were doing on these machines in the early '70s. All memory was handled by manually hard coding addresses, all for the specific machine they had in front of them. You'd be re-writing everything for a different variation of the same model of machine, just because of timing (Unless you were just running calculations and the memory layout was the same, but good luck with that teletype if your baud rate's now 23.5% faster. Where's that variable that holds the timing? Oh wait, that doesn't exist yet, the whole thing's coded around a fixed rate)

EVERYTHING you did was like this, even (especially) disk access. If you were working on a mainframe and wanted to store some data, you'd manually type in the cylinder and sectors you wanted to read or write from, and every now and then, a few minutes after you wrote something, you'd hear someone else across the room yell at the top of their lungs because you just wrote over one of their databases.
No filesystem, NOT EVEN FILES.
There's just a list of employees on the chalkboard and the ranges they're allocated. If you were being real fancy, you'd write this as a document to the first few sectors of the disk, which would actually work if everyone's on a timeshare system and can just load up a text viewer, but when you've already loaded a program, chalkboard it is. Unless you're being cheeky and think you can remember your sectors off by heart, then just mash 'em in dude, she'll be roit.

So this is how everything worked and everyone got on with it pretty well, not many people were complaining.

But then a couple guys over in the US decided they'd had enough of re-writing programs, nearly copying punchcard to punchcard verbatim save for the memory addresses, so they decided to do a little thing called permanently changing what it meant to program and use a computer for the rest of history.
Cont

K*dak only kept the best.
Surprising how much he got around to, just as a side note, he developed some architecture software for them which they used to build a new factory, basically CAD software, so he moved to New Zealand for a few years to oversee construction (this was in 1960 to 1964 if my dates are correct).
I don't know whose fault it was, but the designer left out the space between the walls in the design, and they didn't realize till well after construction had started. It was a big fun mess and many laughs were had.
>"Well Fr*d, I guess there's always gonna be problems when you play around with new tech, using computers for the sake of using computers might have been a bit silly and pointless here (Keep in mind they literally invented CAD; they fixed the design and sent a reel to reel tape over to the US which they used to build multiple new factories, literally copy and pasting entire buildings, the ENTIRE BUILDING PLAN AND ALL INFO, LITERALLY THE ONLY THING BEING SENT OVERSEAS, NO PAPERWORK, JUST A TAPE, AND THEY USED THAT TAPE TO MAKE A WHOLE BUILDING), I guess there's gonna be the occasional couple million dollar mistake here and there lol, anyway building's done, here, have a raise and go home"
His job was literally to do whatever he wanted, so that's why he ended up playing around with image sharpening, every now and then a new boss would show up and go "Hey, what's this guy on our payroll without a job description doing, and why is he earning more than me?". Some problem would crop up in the next month or so, he'd save the day, and the boss/new manager would make sure no-one bothered him or upset him in the slightest.
Still, I'm sure they were SOMEWHAT mad at times, once he was developing a new glue for their slides, because they couldn't import the toxic chemicals from the US they'd been using up till that point anymore. Cont.

Glue story cont.
Long story short he wrapped a 1 foot in diameter, 20 foot long steel roller in miles of cardboard and his new glue, expecting it to come loose when he applied some heat, but it stuck like a rock and they had to chuck this massive, precision engineered roller into the trash and make a new one.
(Happy end, eventually he DID get the glue working, and every slide in the asia pacific region was using his special blend for decades after)

Why C is brilliant, cont.
So these guys over at Bell Labs decided "Know what? We should make a light abstraction layer over machine code, so rather than writing your programs for just one machine, rather than re-writing damn near an entire operating system for every single task you wanted to do, what if you could write a generic program in human readable, English ASCII, and the only thing you'd need to write per machine was a compiler, to take the ASCII instructions and turn them into machine code?
What if rather than keeping 20 or so punchcards you'd slot in halfway through your program if you needed disk access, and another 20 for teletype, and another 20 for every other feature, what if we could write an ascii library of all these functions and you could bundle it into your program if you needed it? What if we could pre-compile them, so they could just sit ready on your machine for whenever you needed them, no need to even build them into every program?
What if, rather than manually typing in memory addresses, you could store stuff in variables that the rest of the program can access BY ENGLISH NAME? Or, when you need to, store memory addresses in variables, which you could still access by name? What if you automated all this, so the program itself allocated, read, and wrote memory addresses to the variables, so no programmer would ever need to do it themselves?
CONT

So, for a machine, all you'd need to make is a compiler, which you could do on another machine using ASCII and compiling it all there, libraries for that specific machine, written in ASCII, that you could either dynamically build into your program, or just precompile and store somewhere if you ever needed it, and literally everything else is just portable code you could move from machine to machine.

THIS is the level we're working at, OP. For some new-age twerp to say it's a "fundamentally flawed language and has many errors" is just mentally painful. By errors he means "It doesn't do this thing we have space to run now, it not doing something that other things do is a FLAW and an ERROR"
We went from "Manually type in the address of the thing you want, or maybe offset stuff from an address if you really want, either way you're keeping a spreadsheet full of hex codes on your desk if you like it or not" to "Lol here it all is, named and everything, no need to know the address, you can simply copy the address of a variable to another variable and use the second to access the first, if something out of scope wants direct access to the first, just give it the second and it can do what the fuck it wants, go wild with math if you like but don't be a twat and TRY to keep track of where everything is"
Do you even begin to understand how much of a DREAM this shit was to programmers? It's all these halfwit newbies that think the compiler should keep everything safe for them no matter what that look at C and go "Now, this language so utterly pure and without clutter, it's FUNDAMENTALLY FLAWED and HAS MANY ERRORS because IT'S NOT FULL OF SHIT THAT DOES EVERYTHING FOR ME, IT DOESN'T GET IN THE WAY, IT DOESN'T STOP ME FROM DOING WHAT I WANT, IT EXPECTS ME TO KNOW WHAT I'M DOING, IT'S SO GARBAGE BECAUSE I'M TOO FUCKING DUMB TO FOLLOW THE SIMPLEST RULEBOOK IN PROGRAMMING"

YES YOU HAVE TO KEEP A VAGUE IDEA OF HOW YOUR MEMORY WORKS, BUT C DOES SO BLOODY MUCH FOR YOU THAT ANYTHING MORE IS LITERALLY "BABBIE'S FIRST DRAG AND DROP HOCKEY GAME"
IT'S NOT EVEN MANAGING MEMORY, IT'S SITTING IN YOUR COMFY OFFICE TELLING THE PR GUY TO GET RID OF DEPARTMENTS YOU DON'T NEED ANYMORE, AND ADDRESSING EMPLOYEES BY THEIR BADGE NUMBER, NOT THEIR NAME, GENDER, WHAT OR WHO THEY ARE, JUST A SIMPLE NUMBER, AND PEOPLE STILL MANAGE TO SCREW IT ALL UP AND SAY "IT'S TOO HARD"

C89 or C99? I hate the whole declaring the iterator outside loops in C89. Should I just get used to it?
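
For anyone who hasn't hit it yet, the difference being complained about is just this:

#include <stdio.h>

void c89_style(void)
{
    int i;                        /* C89: declarations only at the top of a block */
    for (i = 0; i < 3; i++)
        printf("%d\n", i);
}

void c99_style(void)
{
    for (int i = 0; i < 3; i++)   /* C99: declare the iterator where it's used */
        printf("%d\n", i);
}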

Ding ding ding!

>dumb shit like include guards
include guards just allow a header to include its dependencies without fucking up the invoking module.
Modern compilers support #pragma once, which is better, but either way multiple header inclusion should not break your build.
Having used Ada, I can definitely say that this is one way that C/C++ is inferior; having the header file be linked to the compilation unit is nice and fixed a lot of things

>fake OOP
You mean vtables and inheritance? They work exactly as done in C++ except they're explicit. If you work in any large codebase, interfaces are going to be helpful somewhere.
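
A minimal sketch of what that explicit style tends to look like (every name here is invented for illustration):

#include <stdio.h>

struct shape;

struct shape_vtbl {
    double (*area)(const struct shape *self);
};

struct shape {
    const struct shape_vtbl *vtbl;   /* C++ hides this pointer; C spells it out */
};

/* one "subclass": a circle */
struct circle {
    struct shape base;               /* base must come first */
    double radius;
};

static double circle_area(const struct shape *self)
{
    const struct circle *c = (const struct circle *)self;
    return 3.141592653589793 * c->radius * c->radius;
}

static const struct shape_vtbl circle_vtbl = { circle_area };

int main(void)
{
    struct circle c = { { &circle_vtbl }, 2.0 };
    struct shape *s = &c.base;                   /* use it through the interface */
    printf("area = %f\n", s->vtbl->area(s));
    return 0;
}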

>reinvent every wheel
I haven't seen a huge amount of things that fit this. If it's not something you can get a library for, it's probably not worth the effort of integrating something else.

>macros for unneeded "generic" programming
yeah, this makes me want to kill myself. Microsoft is probably the worst offender, with COM and ATL.

use C99 unless you need to support a compiler that doesn't support it. Other than MSVC, I'm not aware of one that doesn't, and you shouldn't support MSVC, so the choice is clear.