Is this even something people still know how to do?
All of your fancy languages are just telling the 1’s and 0’s when to exist. Why not tell them yourself directly?
Programming in Binary
it's pretty much assembly but instructions are in binary
>why not tell them yourself directly
because we don't have to anymore and it's much less efficient in terms of IRL time and money
writing procedures in binary is like writing music by drawing the waveshape of the resulting audio. Also, portability is why we use higher level shit; even writing in assembly should be avoided at all costs
there is no way to write in binary dipshit
This. If you know how instructions are formatted on your processor, you can specify operations, read/write registers and so on, all in binary. There's basically no reason to ever do this, though, aside from being autistic.
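To make that concrete, here is a minimal sketch of my own (assuming x86-64 Linux and the encodings from the Intel manual: B8 imm32 = MOV EAX, imm32, C3 = RET): hand-written machine code bytes copied into an executable page and then called like a normal function.

#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

/* Hand-encoded x86-64 machine code for a function that just returns 42.
 * B8 imm32 = MOV EAX, imm32; C3 = RET. No assembler involved. */
static const unsigned char code[] = {
    0xB8, 0x2A, 0x00, 0x00, 0x00,   /* mov eax, 42 */
    0xC3                            /* ret         */
};

int main(void)
{
    /* Put the bytes somewhere the CPU is allowed to execute them. */
    void *mem = mmap(NULL, sizeof code, PROT_READ | PROT_WRITE | PROT_EXEC,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (mem == MAP_FAILED)
        return 1;
    memcpy(mem, code, sizeof code);

    int (*fn)(void) = (int (*)(void))mem;
    printf("the raw bytes returned %d\n", fn());
    return 0;
}

Whether your OS is happy to hand out a page that is writable and executable at the same time is another matter, but it shows the point: the bytes themselves are the program.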
It would be like building a sandcastle using the end of a needle as a shovel, moving one grain of sand at a time. And you're not allowed to look at more than a few grains of sand at any given moment to see what the fuck you're actually doing. You would probably die before you made anything even remotely useful in any sense. But, yes, you could do it. Programs are just 1s and 0s.
I'm doing this since I'm writing my own compiler. But you said "aside from being autistic", so I don't know any other reason.
Retard alert
What’s wrong with writing your compiler in assembly or even C? I can’t think of any advantage to doing it in straight machine code since whatever you write in a higher level language will ultimately get compiled to machine code anyway. And it will be better and more optimized than anything you would ever manually come up with
I do this. In fact, I'm so awesome I write 8 bits at a time.
>tfw you still address hardware cards using binary
my 26 year old millennial ass farts in your general direction
Because it takes A LOT longer. Why would you do it when a program can do it for you?
This is as close as it gets:
catb.org
enjoyed that read thanks
I like that analogy
depends on the compiler and the level of optimization you're using. GCC will definitely optimize things differently than another implementation of a C compiler toolchain
>Binary is literally a structure of 1s and 0s
>Languages are interpreted
How do you interpret random 1s and 0s without a framework or structure, retard?
Same. We are babbies when compared to those programmers.
>he doesn't code in binary
>he doesn't dream in binary code
i'm a compiler engineer and i know about instruction encodings in order to write and improve backend optimizations.
this is incredible
>Languages are interpreted
?
Have you ever compiled anything
I mean.......... technically he is not wrong. The ALU does interpret machine code. But he is still an idiot.
Well-written compilers will represent code in many different ways; usually these are called IRs (intermediate representations).
For instance, a sophisticated compiler may have IRs in the following chronological order:
Plain text
Token stream
Raw AST
Desugared AST
Typed AST
Flattened AST (ANF or CPS or some combination)
SSA (static single assignment)
Abstract assembler (basically SSA with registers allocated)
Target-specific assembler
Machine code (unlinked object file)
Executable (after linking)
Many of the steps go through multiple passes. Type checking may require extra steps, there might be different forms of SSA between optimizations, etc.
Basically, no well-designed compiler outputs machine code directly. Some of these steps can be done by an external tool. GAS will handle the last three steps; compiling to LLVM will handle from SSA downwards; you could even treat C as your "flattened AST".
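To make one of those middle stages concrete, here is a toy sketch (made-up names, not lifted from any real compiler): a hard-coded AST for (a + b) * (a + b) flattened into three-address, SSA-style temporaries, each assigned exactly once.

#include <stdio.h>

typedef enum { VAR, ADD, MUL } Kind;

typedef struct Node {
    Kind kind;
    const char *name;               /* used when kind == VAR         */
    const struct Node *lhs, *rhs;   /* used when kind == ADD or MUL  */
} Node;

static int temp_counter = 0;

/* Flatten an expression tree into one assignment per line and return
 * the number of the temporary holding its result. */
static int flatten(const Node *n)
{
    if (n->kind == VAR) {
        int t = temp_counter++;
        printf("t%d = load %s\n", t, n->name);
        return t;
    }
    int l = flatten(n->lhs);
    int r = flatten(n->rhs);
    int t = temp_counter++;
    printf("t%d = %s t%d, t%d\n", t, n->kind == ADD ? "add" : "mul", l, r);
    return t;
}

int main(void)
{
    Node a   = { VAR, "a", NULL, NULL };
    Node b   = { VAR, "b", NULL, NULL };
    Node sum = { ADD, NULL, &a, &b };
    Node sq  = { MUL, NULL, &sum, &sum };   /* (a + b) * (a + b) */

    flatten(&sq);
    return 0;
}

A real pass over this form would notice the duplicated add and merge it (common subexpression elimination), which is exactly the kind of rewrite these representations exist to make easy.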
How exactly do you think the first assemblers were made? How could you be this stupid?
I might add that the only thing separating a "compiler" from an "interpreter" (the common use of the terms) is usually the last 4, sometimes 5 steps. So there is really only a very thin line between the two. And interpreters like LuaJIT will do all of them.
>Why not tell them yourself directly?
Because it's very error prone, inefficient and a huge waste of time
No, that's stupid. People "programmed in binary" just long enough to write a hex editor, which they immediately used to write an assembler. They then started calling people like OP retards for not using tools properly.
I wrote a hello world application in straight binary years ago. It was a .COM executable though. Normal executables have too much header information, which I didn't know how to reproduce. Basically, to do this I had to locate a manual containing all the Intel opcodes and their values and learn a little bit about assembly. I wrote the hello world application in assembly first, and then manually translated the assembly instructions into their binary opcode values.
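For anyone who wants to try the same thing, here is a sketch of the idea (not that anon's original file, just the same trick as documented for DOS): a .COM file has no header at all, DOS loads it at offset 0x100 and jumps straight to the first byte, so you can write the opcode bytes out by hand, followed by a '$'-terminated string for the DOS print service.

#include <stdio.h>

int main(void)
{
    /* Raw bytes of a DOS .COM "hello world", hand-assembled from the
     * opcode tables. The code is 8 bytes, so the string lands at 0x100 + 8. */
    static const unsigned char program[] = {
        0xBA, 0x08, 0x01,   /* mov dx, 0x0108  ; offset of the string       */
        0xB4, 0x09,         /* mov ah, 0x09    ; DOS "print string" service */
        0xCD, 0x21,         /* int 0x21        ; call DOS                   */
        0xC3,               /* ret             ; back to DOS                */
        'H','e','l','l','o',',',' ','W','o','r','l','d','!','$'
    };

    FILE *out = fopen("hello.com", "wb");
    if (!out)
        return 1;
    fwrite(program, 1, sizeof program, out);
    fclose(out);
    return 0;
}

Run the resulting hello.com in DOSBox (or real DOS) and it prints the string; there is nothing in the file except the bytes you put there.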
alright you dumbfucks, show me how you're gonna program in binary
really, the instruction pointer is the only thing that tells them when to exist
See
The ALU really doesn't interpret machine code, the control unit does. Specifically, the decoder in the CU "interprets" machine code.
no you dipshit, show me an EXACT tutorial and environment to code in binary
Nobody ever used binary, except maybe for specifying certain values when they wanted to push bitflags or things like that to memory. Assembly or hex codes were used.
that's what I've been saying the whole thread and those degenerates called me a retard. I've never seen anybody write a program in binary. Isn't binary what the compiler translates code into?
writing in binary sounds like saying I made water from water
The CPU simply sees the instruction and does it. Compilers change other languages into binary
First of all, let's get this clear: There is no difference between binary codes, hex codes and assembly codes, other than the way they are presented to the user.
Secondly: do not fool yourself into thinking the computer processes a bitstream -- it's actually a bytestream (or more than that, depending on bus width?).
Thirdly:
>Isn't binary what the compiler translates code into?
Binary is a certain way to represent numbers. This very post has a binary representation (several, actually, depending on locale settings etc), but that doesn't mean I can feed that into a processor and expect useful results. What you should be asking is: "Doesn't the compiler generate bytecodes?" Short answer: no. Long answer: It depends on the compiler. For example, when you write a Hello World program in C and compile it using gcc on GNU/Linux, you get a file that the GNU/Linux operating system can interpret, which is NOT composed (entirely) of bytecodes for your processor. The resulting file is very different from a file that will run "on the processor itself" (as in, a C program that is designed to run on a microcontroller without an operating system per se).
Disclaimer: I am not a CE so take what I said with a grain of salt.
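That last point is easy to check yourself; a quick sketch, assuming a GNU/Linux box with some ELF binary lying around (I use /bin/ls here, any compiled program works):

#include <stdio.h>

int main(int argc, char **argv)
{
    /* The first four bytes of anything gcc produces on GNU/Linux are the
     * ELF magic (0x7F 'E' 'L' 'F') -- data for the OS loader, not opcodes. */
    const char *path = (argc > 1) ? argv[1] : "/bin/ls";
    unsigned char magic[4];
    FILE *f = fopen(path, "rb");
    if (!f || fread(magic, 1, sizeof magic, f) != sizeof magic)
        return 1;
    printf("%s starts with: %02x %02x %02x %02x\n",
           path, magic[0], magic[1], magic[2], magic[3]);
    fclose(f);
    return 0;
}

It prints 7f 45 4c 46: header metadata that the kernel and dynamic linker parse before any of your machine code ever reaches the processor.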
>Is this even something people still know how to do?
I have learned how to, but I don't recall all the functions; it is meaningless.
It is useful when you program hardware as well, but abstractions are very valuable.
>Why not tell them yourself directly?
Because it would take more time for me to write, more time for you to read and therefore more time would be wasted.
But I have an application that can translate it for me, so why not use that?
en.wikipedia.org
Or do you mean specifically how to write the files?
Most tutorials revolve around assembler.
cirosantilli.com