X86 was a mistake

x86 was a mistake
C was a mistake

prove me wrong
you literally can't prove me wrong and i 100% mean it

x86 sucks, it's a huge mess

but you can't prove C is bad; the only thing that could be called evil is macros, and they're powerful if you know how to use them wisely.

vocaloid was a mistake, but I fucking love it anyway.

>but you can't prove C is bad
Literally the reason we have so many fucking vulnerabilities everywhere
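
the kind of bug in question, sketched in C (function and buffer names are made up, but the pattern is the classic one):

/* Classic C memory-safety footgun: nothing stops the copy from running past
   the end of the buffer. Hypothetical function for illustration only. */
#include <string.h>

void save_username(const char *input)
{
    char name[16];
    strcpy(name, input);  /* no bounds check: anything longer than 15 chars smashes the stack */
}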

memelang/10

what do you prefer to x86?

python is pretty sweet tho

you can fix software issues. problems arise when there are issues in hardware, my dude. a tool is only as good as its user.

>x86 was a mistake
It is shit, yet I think nothing better would have happened.

People just want functionality; they don't really care to demand that things are done "properly" until much later. And by "they" I mean a fraction of the users that has some brains and needs; the rest will actually prefer everything remain the same at that point.

Also see Python for example - from the standard library up it's an absolute mess, and even the smallest changes from 2->3 take decades.

>what you prefer to x86?
literally anything else
>python is pretty sweet tho
what
are you comparing python to C

If we had used x86 for about a decade and then let it fade into obscurity, it wouldn't have been so bad. It's a shame the demands for legacy compatibility were so strong; that's what caused the whole mess. x86 was even kind of backwards compatible with the 8080/8085 when it was introduced. Intel wanted a clean design but chose to keep computer homebrew hobbyists happy instead for the sake of userbase.

>pytard detected.

Yeah Python is better than C. Roughly the same speed and 1/5th as many lines of code needed to do anything. also security is better.

>same speed
Wow, an idiot on Sup Forums. Huge surprise.

>Roughly the same speed
what the fuck is this retard saying

C is lowlang
python is a high level lang

Not seeing any other CPUs with comparable single-thread performance. C really was a mistake though. There were better languages even before it.

>Roughly the same speed
you wish

What's wrong with x86?

>Roughly the same speed
Only when all the non-trivial parts of the program are done in libraries written in, you guessed it, C.

Yeah. PyPy is now

It's better in every way so animetard hates it.

shits itself because muh legacy support
flaws
sec risks
ugly overcomplicated instruction structure

You are failure

>less than half the speed
>this is ok
go back to drinking starbucks and making horribly optimized javascript programs
faggot

It's enormously complicated, which increases the complexity of compilers, assemblers, debuggers, hardware implementations and everything else, and also increases the likelihood of some obscure exploit like the APIC exploit we saw with Sandy Bridge and earlier. It's so complicated, in fact, that they've given up trying to implement it literally in hardware: most x86 microarchitectures now decompose CISC x86 instructions into collections of RISC-like "micro-ops" for the backend to actually execute. It's also proprietary, which puts us in a position where the manufacture of desktop/server processors is basically under the complete control of just two companies, with fuck-all chance for anyone else to introduce further competition.

>PyPy
it's trash

1/2 the speed as in 2x slower, but this is just a fact.

Attached: RIP C.jpg (750x750, 91K)

>prove me wrong
Prove yourself right.

literally read the thread.

>I mean a fraction of the users that has some brains and needs
Few, maybe 5% of the anti-x86 crowd fits that description, the majority are just pseudo-intellectual contrarians and phone-tards with platform-agnostic use cases regurgitating marketer-speak and talking points from the '90s they barely understand, usually pushing for an equally shitty alternative like ARM that is just as old, "bloated" and "inelegant."

>Yeah Python is better than C
Python is a fucking glue language, you only write high-level APIs in it. C, C++, Fortran, etc. do the hard job and you supply a Python wrapper for the brainlets. That's how it works.
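
roughly what that split looks like, as a minimal sketch against CPython's C API (module and function names are made up):

/* "C does the hard job, Python gets a wrapper": a tiny CPython extension module. */
#define PY_SSIZE_T_CLEAN
#include <Python.h>

/* The hard job: plain C doing the number crunching. */
static long sum_squares(long n)
{
    long acc = 0;
    for (long i = 0; i < n; i++)
        acc += i * i;
    return acc;
}

/* The thin wrapper Python actually sees. */
static PyObject *py_sum_squares(PyObject *self, PyObject *args)
{
    long n;
    if (!PyArg_ParseTuple(args, "l", &n))
        return NULL;
    return PyLong_FromLong(sum_squares(n));
}

static PyMethodDef GlueMethods[] = {
    {"sum_squares", py_sum_squares, METH_VARARGS, "Sum of squares below n."},
    {NULL, NULL, 0, NULL}
};

static struct PyModuleDef gluemodule = {
    PyModuleDef_HEAD_INIT, "glue", NULL, -1, GlueMethods
};

PyMODINIT_FUNC PyInit_glue(void)
{
    return PyModule_Create(&gluemodule);
}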

ok write a bootloader for X86 in python without using C

Write a bootloader for x86 without using any assembly.

>Roughly the same speed
You're stupid! (it is worth repeating it)

C is converted into assembly, so it's impossible.

can't argue with facts, so there's nothing to prove wrong:
pastebin.com/UAQaWuWG

>you can fix software issues.
i think it's pretty clear now that we can't
>a tool is only as good as its user.
lmao

python is javascript 6 with forced indentation.

>i think it's pretty clear now that we can't
how so?

The fastest stuff for python just turns it into C

>it into C
and Fortran or C++

Python is converted to C in PyPy.

Checkmate atheist.

C has its place but way, way too much is written in C that should have been written in something safer.

Like Python!

(running on PyPy)

>Python is converted to C in PyPy.

That's not the point here, moron. We need unsafe memory operations for developing fast and small firmware. Programming is not just about your fizzbuzz.
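
e.g. the sort of thing firmware does all day (register address and bit position are invented for illustration; the real ones come from the chip's datasheet):

/* "Unsafe" memory access firmware actually needs: poking a memory-mapped register. */
#include <stdint.h>

#define GPIO_OUT ((volatile uint32_t *)0x40020014u)  /* hypothetical MMIO register */
#define LED_PIN  (1u << 5)

static void led_on(void)
{
    *GPIO_OUT |= LED_PIN;  /* read-modify-write straight to the hardware */
}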

The gnu autist in me says guile

Ada and Haskell are the way to go.

>It is shit, yet I think nothing better would have happened.
the most likely other option back when this was up for grabs would have been 68k-based (you know, like the Mac, the ST, the Amiga, Sun workstations, etc), and that's a drastically better architecture

>you can fix software issues
you can also write web apps and browsers using pure assembly.

Just because you can do something doesn't mean it isn't a retarded waste of effort. Preventing issues should be the goal, otherwise we're doomed to constantly spinning our wheels over the same dumb shit over and over again. C helped prevent issues prevalent when dealing with pure fortran and assembly, and now newer languages can help prevent dealing with issues prevalent with writing pure C. That's pretty much the whole point of abstraction.

You don't need unsafe memory operations.

You just have to use them because that's how you interface with the hardware.

But that doesn't mean Python wouldn't have made a better interface if you could have used it.

unironically this. There's nothing C does that Ada can't do better.

Haskell is in a league of its own and practically unmatched for making highly maintainable software, but at least Ada would be less foreign to all the pajeets out there while still being a noticeable improvement.

Pascal and Ada are for retards.

exactly. doesn't mean c is bad as implied.

what 0-day sqlite vulnerabilities are you holding onto?

being a brainlet who could only learn

#fizzbuzz
import UniversalStackOverFlowSolver as brainlet

brainlet.Ask("Please I need a fizzbuzz program, thank you guys, kiss, kiss").execute();


it's his vulnerability

#Say hello 10 times
#author: Taylor Swift
#import DoTenTimes
#import SayHello

DoTenTimes(SayHello)

>x86 was even kind of backwards compatible with the 8080/8085 when it was introduced. Intel wanted a clean design but chose to keep computer homebrew hobbyists happy instead for the sake of userbase.
On the other hand, Motorola opted not to make the 68000 backwards compatible with the 6800.

It's based on some extremely clumsy, antiquated 1970s design features, one of the more notable ones being port-based I/O.
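
for reference, port I/O looks roughly like this from C, assuming GCC/Clang inline assembly (0x3F8, the usual COM1 port, is just an example):

/* Minimal sketch of x86 port-based I/O wrappers as commonly written in osdev code. */
#include <stdint.h>

static inline void outb(uint16_t port, uint8_t val)
{
    __asm__ volatile ("outb %0, %1" : : "a"(val), "Nd"(port));  /* write byte to I/O port */
}

static inline uint8_t inb(uint16_t port)
{
    uint8_t val;
    __asm__ volatile ("inb %1, %0" : "=a"(val) : "Nd"(port));   /* read byte from I/O port */
    return val;
}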

Actually, even some 8-bit CPUs like the 6502 had a cleaner, more elegant architecture/instruction set.

There are tons of better options than x86; we've been cucked by cuckntel.

>x86 sucks.

Would you rather be using RISC or something for trivial tasks and coding? You are not THAT retarded are you?

MOV EAX,[EBX]
SHL DWORD PTR [34F5h],CL
MOV EDI,34h
MOV ECX,0Eh
REPNZ SCASB

And so forth. An instruction set like this is a mess.

Especially the indirect indexed addressing, that's bullshit.

ROL DWORD PTR [EDI+12],CL

Which means "rotate the doubleword stored 12 bytes past the address in EDI left by the count in CL".

Write a bootloader by carving the 1's and 0's onto a stone tablet.

Checkmate, atheists!

C is the best shit ever conceived you massive faggot. Give me something that is as fast and doesn't take an eternity to write.

they did not know better at the time

>x86 was a mistake
Nothing wrong here.
>C was a mistake
Did pajeet the webdev lose his job today?

Writing an x86 bootloader also exposes the user to the deliciously antiquated 16-bit real mode with 64K segments, until you can throw the CPU into protected (and eventually long) mode.
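
real-mode addressing is just segment*16 + offset, which is why each segment tops out at 64K; toy arithmetic below (not a bootloader, and 07C0:0000 is just the conventional boot-sector example):

/* 20-bit physical address from a 16-bit segment and a 16-bit offset. */
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint16_t segment = 0x07C0;   /* where the BIOS loads a boot sector */
    uint16_t offset  = 0x0000;
    uint32_t physical = ((uint32_t)segment << 4) + offset;
    printf("%04X:%04X -> physical 0x%05X\n",
           (unsigned)segment, (unsigned)offset, (unsigned)physical);  /* 0x07C00 */
    return 0;
}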

Ada is the preferred language for mission critical software in a lot of industries (but mostly the government/military) because of its guarantees about code correctness:
youtu.be/QB-pd1fdXSk

>he doesn't know about the POWER architecture

mellanox.com/blog/2017/12/earth-shattering-io-performance-ibm-power9-servers/

datacenterknowledge.com/design/ibm-designs-performance-beast-ai

anandtech.com/show/10435/assessing-ibms-power8-part-1

If the IBM PC had come out a year later, a 680x0 architecture would have been viable, but in 1980-81 Motorola did not yet have production units or all of the supporting chips available, so the 8086 was the only option.

I don't need to, the burden of proof is on (You)

>x86 was a mistake
Hey, a reasonable post on Sup Forums for on-
>C was a mistake
delet this

POWER has been irrelevant outside of vendor-locked legacy shops for over 20 years

IIRC IBM also had a closer relationship with Intel than they did with Motorola.

And actually, early IBM plans for a personal computer entailed using an 8-bit Z80 architecture but Bill Gates said that 16-bit CPUs were the future and why not just use that instead?

The 6800 was never all that popular due to its price, I think the installed user base of 8080 machines was bigger so compatibility was a higher priority.

There are a couple of problems with ANSI C. All of them are fixed in C99.
1. You cannot declare variables inside a for loop - this is fixed in C99, and I think even most pre-C99 compilers support it as an extension.
2. The types are not standard sizes. A char is always one byte, but one byte is not always 8 bits - however, there is CHAR_BIT. An int is 2 bytes on many systems and 4 bytes on many others. If you truly want to be portable, you have to check the values in limits.h. C99 has stdint.h at least. You also kind of need a hack to portably tell whether a processor is big endian or little endian.
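
a rough sketch of those portability points (C99 headers plus the usual run-time endianness trick):

/* Fixed-width types from stdint.h, CHAR_BIT from limits.h, endianness probe. */
#include <limits.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    printf("bits per byte: %d\n", CHAR_BIT);
    printf("int is %zu bytes here; int32_t is always 32 bits\n", sizeof(int));

    /* Endianness hack: look at the first byte of a known 16-bit value. */
    uint16_t probe = 0x0102;
    uint8_t first = *(uint8_t *)&probe;
    printf("%s endian\n", first == 0x01 ? "big" : "little");
    return 0;
}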

yarchive.net/comp/ibm_pc_8088.html
There was a little more to it, to paraphrase from that. A 68000-based PC would have been more expensive and complex to build, the software tools weren't as good from the perspective of the engineering team, and they would probably have had to depend on another division within IBM to deal with Motorola.

8-bit chips were just plain stifling, other 16-bit options like the Z8000 were needlessly incompatible garbage with no future, and an in-house IBM solution was both expensive and stifling to developers due to high entry cost. The 8088 had its flaws like anything else, but it was just right for the time.

#include <stdint.h>

uint64_t / uint32_t / uint16_t / uint8_t

Macros lead to 90s era Windows programming in C/C++. Fuck a textual preprocessor. Give me AST-based macros which can be typechecked!
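
the kind of footgun he means (toy macro, but the expansion problem is real):

/* The textual preprocessor pastes text; it doesn't understand expressions or types. */
#include <stdio.h>

#define SQUARE(x) x * x            /* naive textual expansion */

int main(void)
{
    printf("%d\n", SQUARE(1 + 2)); /* expands to 1 + 2 * 1 + 2 == 5, not 9 */
    return 0;
}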

>8-bit chips were just plain stifling
Uh...yeah. That's why when they first met with Microsoft that Bill Gates said there was no reason to limit themselves to an 8-bit CPU.

In an interview he gave in 1982, he talked about all the advantages that the IBM PC had over 8-bit machines, especially in terms of how much more sophisticated software could be.

>not an argument

The 68000 wasn't cost-effective for a consumer PC until 1984 when the Mac came out.

What alternatives are there to x86?

Lotus 123 was the first piece of software that really demonstrated the greater power of a 16-bit machine over an 8-bit one.

but Bill Gates doesn't know shit about computers, he was and is a businessman.

see also: raptorcs.com/

literally a non-issue
after proper CFI gets widely adopted, the only thing left will be data-only exploits

One of the big advantages of the x86 was that Intel had the 8088, a "lite" 8086 with an 8-bit data bus which cost less and allowed the use of readily available 8-bit supporting chips.

The 8086 copied the Z80's approach of having way too many instructions for any and all conceivable uses, even though some of them are rarely used or needed.

explain how C was a mistake

that's not how you say "mediocre programmers"

see

>muh No True Scotsman

Intel should ditch the instruction decoder and expose the RISC internals

According to that document C is great.

How is C bad? It's one of the better low-level languages. x86, I never understood why it took off.

>x86, I never understood why it took off
Because what other options were there in the early 80s for a viable 16-bit CPU?

> low level languages
>C

x86 wasn't a mistake, the 8086 was awesome. i386 was the mistake. amd64 was literal brain damage. ia64 was even worse, so i guess you could say we got lucky that amd won that race. cisc chips are great, risc sucks in so many ways by comparison. the tragedy is we have a disgusting cisc architecture competing with a really good risc architecture, and the risc architecture might actually win. intel fucked us all way back in 1985 with their mindless shekel-chasing. i hate intel so fucking much for everything they've done to tech.