Probably the most sperg idea of our lifetime in tech cycles is that 64bit processors must always run 64bit operating systems. Not only is that retarded, it makes the system much slower in several cases. This is especially true on processors that can barely run 64bit instructions.

Contrary to popular belief, the RAM being no more than 4GB on those systems is not the most important reason. It's just that those systems usually don't run software that would take advantage of 64bit instructions in a way that makes the old hardware run faster. There's more to it than that.

The problem is severely exacerbated by the fact that the slow hard disks on those old systems struggle to load the extra size of the binaries. In case you haven't noticed, 64bit binaries are much larger than 32bit binaries, e.g. the Win 10 32bit ISO is 2.9GB vs 3.8GB for 64bit.
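
If you want a feel for where part of the size difference comes from, here's a minimal C sketch (assuming gcc on Linux with 32bit multilib installed; the numbers in the comments are typical, not guaranteed):

/* sizes.c - compare fundamental type sizes between 32bit and 64bit builds.
 * Build both ways (assumes gcc with 32bit multilib support installed):
 *   gcc -m32 sizes.c -o sizes32
 *   gcc -m64 sizes.c -o sizes64 */
#include <stdio.h>
#include <stddef.h>

int main(void) {
    /* ILP32 (-m32) typically prints 4/4/4; LP64 (-m64) on Linux prints
     * 8/8/8. Every pointer and size_t in every data structure doubles. */
    printf("void*  : %zu bytes\n", sizeof(void *));
    printf("long   : %zu bytes\n", sizeof(long));
    printf("size_t : %zu bytes\n", sizeof(size_t));
    return 0;
}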

Yeah but there is no Steam for 32 bit Linux OSs.

Check mate, OP.

>Probably the most sperg idea of our lifetime in tech cycles is that 64bit processors must always run 64bit operating systems.

I never noticed such an idea.

At least Linux can be made to run with Xfce or some other retarded minimalist desktop that cuts the bloat significantly. Windows systems, which most people have to run (because they need windows-only software for their job), take a severe performance hit in several cases.

I recently had one of the first 64bit dual core Intels running 64bit Windows and it was hell just waiting for it to load. Putting 32bit Windows 10 on it made it about twice as fast.

The fact they have shit hard disks as well makes it worse. Those binaries are about 50% larger.

>new stuff should be able to run perfectly on my ancient hardware

It's very common. Do a simple google search. The overwhelming majority of forums automatically suggest "derp, if it's 64bit processor better run 64bit windows anyway, derp".

It's a retarded idea. Those binaries are about 50% larger, and the mere fact that the shitty hard disks in those old PCs have to do 50%+ more work makes it a bad idea.

Especially if they don't even have a lot of memory.

Yep, that's the stupidity.

Good luck doing anything on a system with a max of 4gb of ram, especially if you have a video card made in the last 10 years (the OS+gpu will be using about half of that just idling). Sure PAE exists, but do any applications that people actually use even support it? The answer is no

>the Win 10 32bit ISO is 2.9GB vs 3.8GB for 64bit
That's because 64bit Windows also carries all the 32bit stuff to provide compatibility with 32bit software.

Correct. 64bit Windows is like two OSes in one.

That's stupid. Most people run Chrome and use their computers as Facebook and Youtube machines. It's perfectly respectable to keep a 10 year old computer for that, and it's retarded to cripple it with 64bit windows when its hard disk is shit, the RAM is not even more than 4GB, and the CPU is one of the first that could barely do 64bit.

That's retarded. I challenge you to compare the binaries (.exes or .dlls) of any program. You will find that the 64bit binary of the same version is always considerably larger.

Ya but the number of people that are still using computers with those specs is tiny. Anyone with computers that old just uses their phone for everything now. And Chrome on my computer right now is using close to 2gb of ram

Other than commercial machines (which are usually still running xp or some shit, and are pretty much used for one purpose only), hardly anyone has those anymore. I've also never noticed a difference between 32 and 64bit windows on my old laptop with specs similar to what you described, and I installed and tested both out for that very reason

Linux has had an ABI specifically for 32bit programs on a 64bit kernel, but not many people seem interested:
linux.slashdot.org/story/13/12/24/2025228/linux-x32-abi-not-catching-wind

>The problem is severely exacerbated by the fact that the slow hard disks on those old systems struggle to load the extra size of the binaries.
>the Win 10 32bit ISO is 2.9GB vs 3.8GB for 64bit
No, the issue isn't the HDD being slow, it's Windows thrashing the disk like a retard even when there's plenty of free RAM; the HDD can't do more than one thing at once, so performance goes to shit.

This

Last time I tried facebook on an older machine the autoplay videos and other garbage pushed the CPU to 100%, 64bit or not isn't going to make a difference.

I don't dispute that, it's true, but I think the size difference is closer to 20% on average. One advantage of 64bit binaries is that they can target instruction set extensions that i686 (Pentium Pro) compatible binaries cannot.

That's just bullshit. The overwhelming majority of the developing world is running those systems, the developed world runs those systems routinely (e.g. for basic office work or basic living-room facebook-browsing) and not everyone is running on the streets carrying an android device 24/7.

One of the most retarded ideas of the tech world is that desktop and laptop PCs as a whole are obsolete and must cease to exist.

Only a retard would believe we must always be mobile and we should abandon the customization advantages.

Poorfag pajeets : the thread

Even mobile is 64bit now.

With Linux you can run 32bit on 64bit, but there's usually no reason to. Everything available as a package can be compiled as 64bit and run native. Having 32bit support means you have to download the 32bit libraries, which use disk space. Since it's rarely needed on Linux, it may be a waste of space.

>Linux has had an ABI specifically for 32bit programs on a 64bit kernel

That's pretty cool. I only found out about it recently when I was looking into this problem. I believe it's exactly this problem it tries to solve.

But yeah, people are too retarded to look into it so deeply (and it's not even that deep really).

Even "tech experts" are into a "derp, 64bit cpu must always use an 64bit os" loop.

Not to mention dependency hell

>CPU to 100%, 64bit or not isn't going to make a difference.

Sometimes that is true, but I believe that's only part of the problem, because those people routinely reboot their computers, and even a "fast shutdown" of Windows 10 can be much slower with 64bit binaries than with 32bit.

In some rare cases they even thrash the disk more because of the higher RAM usage, especially with only 2GB. I have seen that issue personally: the CPU could barely do 64bit and the RAM was 2GB.

More like economical computing for the masses. When some people use computers as Facebook machines, why make their experience worse? If someone wants to run games and stuff, then yes, we're talking about pajeet computing.

It's also a matter of security. Having fewer ABIs means a smaller attack surface. If a piece of malware is 32bit then it cannot be run on 64bit Linux without the 32bit support libraries. If those libraries do not exist then the OS is essentially immune. Every bit helps with security.

>The problem is severely exacerbated by the fact that the slow hard disks on those old systems struggle to load the extra size of the binaries.
HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

No, we have people complaining that their Pentium Pro with 2gb of ram runs w10 like shit in this thread.

Compare the sizes of binaries of any program you like on 32bit and 64bit of the same version, see that they are considerably different, and then call yourself a retard.

That's just the standard 32bit multilib; x32 is a different ABI which can use the extra registers and instructions of x86-64 processors while reducing memory usage by using 32bit pointers.

cds.cern.ch/record/1528222/files/LHCb-TALK-2013-060.pdf
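
If you want to poke at x32 yourself, here's a minimal sketch, assuming a gcc with x32 support and a kernel built with CONFIG_X86_X32 (most distros don't enable it):

/* x32demo.c - x32 in a nutshell: x86-64 registers and instructions,
 * but 32bit pointers.
 * Build (assumes an x32-capable toolchain and kernel):
 *   gcc -m64  x32demo.c -o demo64    pointers are 8 bytes
 *   gcc -mx32 x32demo.c -o demox32   pointers are 4 bytes, yet the code
 *                                    still uses all 16 GPRs and 64bit math */
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint64_t big = 0x1122334455667788ULL;  /* still one 64bit register even under -mx32 */
    printf("pointer size: %zu bytes\n", sizeof(void *));
    printf("64bit value : %llx\n", (unsigned long long)big);
    return 0;
}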

>Compare the sizes of binaries of any program you like on 32bit and 64bit of the same version, see that they are considerably different, and then call yourself a retard.

HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

It surprised me, but it actually runs great on 32bit. On 64bit Windows 10 it was horrible. I think the main offender was the 2GB of RAM causing disk thrashing; on 32bit it only uses 1 to 1.5GB. But the mere fact that the hard disk is shit exacerbates the problem too, because of the much larger binaries.

Exactly. It's pretty cool and solves the problem of this thread very specifically. But it's too techy for most people to even care.

It might be a good business move to run it on those simple netbooks that are practically Chrome machines.

Why waste more memory on a simple machine, and why give up the extra instructions at the same time?

Depending on the ISA (Intel, whatever), various instructions need to be word-aligned (which may cause them to take more space), and numerical constants need to match the processor's word size. Binary code is full of int constants: your loops start at zero, etc. In short, the likely cause of the extra space is int padding and word alignment of constant data.
Depending on how it's compiled, 64-bit code may well use more memory. Data structures also want to be word-aligned for fast access, and the compiler may choose to pad your structs. Also, depending on the compiler, some int constants may change size. (That's why you always see typedefs like uint32: guaranteed size.)
One reason for writing a 64-bit program is to be able to use more RAM. This requires using 8 bytes for an address instead of 4 bytes in a 32-bit program. That might increase the program size.
On the other hand, if your program handles data that is intrinsically 64 bits, it can do that in a single instruction where a 32-bit program would use two instructions. This would make the program smaller and faster.
The char 'A' uses a single byte in either case, so that doesn't matter. A minor part is that 64-bit integer operations need an extra prefix byte (REX), so some opcodes get slightly longer.
And 64bit binaries are generally about 20% larger.
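
To make the padding/pointer point concrete, a small sketch (the exact layout depends on compiler and ABI, so treat the numbers as typical rather than guaranteed):

/* padding.c - how pointer size plus alignment inflates data structures.
 * Compile with gcc -m32 and gcc -m64 and compare the output. */
#include <stdio.h>
#include <stdint.h>

struct node {
    int32_t value;      /* 4 bytes under both ABIs */
    struct node *next;  /* 4 bytes on 32bit, 8 bytes on 64bit */
};

int main(void) {
    /* Typical result: 8 bytes on 32bit; 16 on 64bit, because the
     * pointer must be 8-byte aligned so 4 bytes of padding appear
     * after 'value'. The struct doubles without holding more data. */
    printf("sizeof(struct node) = %zu\n", sizeof(struct node));
    return 0;
}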

Alright, fair enough, and I mentioned office work vaguely when I said "commercial machines", but I still don't really see the problem. You can run 32bit windows if you want, and the vast majority of programs still have 32bit versions. It's like, you can run straight pipes on vehicles older than a certain year, but newer cars are still going to come with catalytic converters.

One thing that you can't argue is that on 64bit systems, things like memory leaks just aren't a big deal anymore. If a program is using up 3GB+ of RAM, you don't even notice it unless you start trying to do tons more shit at once. There is no way I can go back to being limited to 4GB of RAM on anything other than my phone

ChromeOS is based on Gentoo so it shouldn't be too hard for google to implement it.

Size of binaries doesn't matter anymore since 1TB HDDs cost $50.

Are you some kind of retard?

You can represent 32bit datatypes in a 64-bit processor's registers; there is no performance impact from the wider registers.
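
A tiny sketch of the idea (the instruction names are what a typical x86-64 compiler emits; check the assembly yourself with gcc -O2 -S):

/* int32.c - 32bit arithmetic on a 64bit CPU uses the 32bit register
 * forms (eax, edi, ...), so narrow data types cost nothing extra. */
#include <stdint.h>

int32_t add32(int32_t a, int32_t b) {
    /* typically emitted as a 32bit 'addl'/'leal', not a 64bit 'addq' */
    return a + b;
}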

>read/write speed doesn't matter
Retard

Op has clearly been sperging out because his old athlon 64 cpu and 200gb ide hdd can't run windows 10 flawlessly

ITT: OP never heard about SSD

>Implying any Chromium based browser will ever use less than 4GB of ram just showing the google home page.

64-bit was a mistake. 32-bit systems were much faster and lightweight.

More RAM is usually a better idea, unless your OS is shit at caching filesystem entries.

Why should I only use half of my CPU's potential?

So basically the entire OS is technically 64bit, but software can use 32bit pointers to improve density? Interesting, it's kind of like the Thumb 16bit ISA that's part of ARM processors, if I'm understanding that right.

64 bit code sometimes performs faster
upgrade your pleb tier pc

/facepalm. The whole point of this is to make older hardware that's still lying around, or new but weaker hardware (e.g. a simplistic tablet), run better. It's not about my personal machine, which can run 64bit pretty well, much better than what most people have.

>Chrome on my computer right now is using close to 2gb of ram
That's just all the data it has collected and will send to Google. My Firefox runs at

>posting this in 2017
>processors that could run 64bit instructions barely
>slow hard disks

32-bit was a mistake. 16-bit systems were much faster and lightweight

8bit still rules. Z80 ftw

Nobody stops you from using 32 bit binaries on the 64bit OS.

Operating systems should be 64-bit to make applications that need 64-bit pointers possible. Most applications should be x32. I would note that this is not the same as legacy 32-bit applications; rather, it uses the 64-bit instruction set with 32-bit pointers. Unfortunately, only Linux really has an x32 ABI.

>Most applications should be x32.
Most distros already maintain two sets of packages (32bit and 64bit); a third ABI would require even more space. x32 simply doesn't offer enough benefits to make it worthwhile for distro maintainers, which is why it has seen little usage. It's probably only suitable for embedded work or for source-based distros like Gentoo.

Sequential read speeds of HDDs are fast enough that there's no noticeable difference between loading 32bit and 64bit binaries.

degenerate poo in the loos the thread

It's probably *best* suited for embedded work or for source based distros like gentoo.

Potentially all software could benefit from it but the reward is simply not worth it in most cases.

PAE is an abomination and needs to die.

Also, original 32 bit addressing didn't allow for more than 1GB RAM. An address extension allowed for 4GB.

But by your logic, we should be using 386's...

that's not what that is

1) It has proven faster to perform binary operations on 64bit values when the CPU architecture is amd64 than to have an amd64 CPU perform operations on 32bit or other sizes.

2) data structures such as B-trees help with the slow-disk retrieval problem by quite a bit. It just takes competent programmers.

3) Get an SSD, you pleb, the future is here. They even have PCIe SSDs.

4) write more assembly if you're really trying to squeeze more efficiency out of your cpu. You'd be amazed at how fast shit runs when it's written specifically for your cpu+register size.

That's retarded; most of the time on those shitty PCs is spent running the default binaries. Running Chrome in 32bit is not going to make a difference on those Facebook/Youtube machines. The majority of the time, RAM and CPU time are spent on the OS itself.

true

The argument is mainly for people who can't be arsed to have more than a Facebook/Youtube machine. They don't need to spend more for some techie's delusion that it's more modern, since they will never run anything more than a browser. Running 64bit binaries that make their hardware run worse is retarded, given that's exactly what happens on their shitty hardware.

Any respectable program will have a data/code ratio of hundreds or more. It is irrelevant.

64bit programs also might run faster because the compiler has more registers to work with, and thus can optimize more aggressively, which is a boon regardless of the amount of memory.

>It is irrelevant.
>64bit programs also might run faster

That's exactly the retarded thinking the OP explicitly refers to.

It's not a coincidence the x32 ABI was invented (not i386).

And when scarce RAM is already thrashing the disk, it's important.

x32 still needs a 64bit OS. x32 is a strict subset of x86-64 that is deliberately limited to using 32bit pointers.

>the Win 10 32bit ISO is 2.9GB vs 3.8GB for 64bit
they'd be closer to the same size if the 64bit edition didn't need to carry 32bit libraries to support 32bit software

I wonder how much of a performance overhead there is when every pointer has to be 8 bytes instead of 4. Less stuff overall fits in the caches.
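
Back-of-the-envelope sketch, assuming a 64-byte cache line (typical on x86; the constant is an assumption):

/* cache.c - pointer-size pressure on the cache, roughly.
 * Assumes 64-byte cache lines, which is typical on x86. */
#include <stdio.h>
#include <stddef.h>

int main(void) {
    const size_t line = 64;  /* bytes per cache line (assumption) */
    /* An array of pointers packs 16 per line with 4-byte pointers
     * but only 8 per line with 8-byte pointers, so pointer-heavy
     * data touches twice as many lines. */
    printf("4-byte pointers per line: %zu\n", line / 4);
    printf("8-byte pointers per line: %zu\n", line / 8);
    return 0;
}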

If you weren't a fucking neet, you would know how fucking retarded you sound right now.

This is all you are getting out of me.
1/10 for getting me to reply

I'll take this bait.
>Not only is that retarded, it makes the system much slower in several cases.
The only real case is the 4-level page tables, which isn't actually an issue because of 1000+ entry TLBs. Also you can opt out of this on Linux with the x32 encoding, which lets you keep the 3-level page tables but use the x64 opcodes.
>It's just that those systems usually don't run software that would take advantage of 64bit instructions in a way that makes the old hardware run faster.
I mean, duh. Being able to process literally TWICE the data per cycle is kind of a big win. Add two 64bit ints in x86 mode. OH I'LL WAIT. How long does that take you? 4x load, 2x add, 1x branch LMAO
>The problem is severely exacerbated by the fact that the slow hard disks on those old systems struggle to load the extra size of the binaries.
We have SSDs and NVMe drives that are literally 100x faster than spinning rust.
>In case you haven't noticed, 64bit binaries are much larger than 32bit binaries, e.g. the Win 10 32bit ISO is 2.9GB vs 3.8GB for 64bit.
LOL
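
The "twice the data" point is easy to see in a compiler sketch (the instruction names are what gcc typically emits, not a guarantee):

/* add64.c - adding two 64bit integers.
 * On x86-64 this is one 64bit add; built with -m32 the compiler has to
 * split it into an add/adc (add-with-carry) pair over the 32bit halves.
 * Compare: gcc -O2 -S add64.c  vs  gcc -O2 -m32 -S add64.c */
#include <stdint.h>

uint64_t add64(uint64_t a, uint64_t b) {
    return a + b;  /* 'addq' on x86-64; 'addl' + 'adcl' on i386 */
}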

I r8 this b8 an 8/8 dont h8 post more I cant w8.

only expensive high end devices