Does buying used Xeons for gaming PCs still make sense?

Other urls found in this thread:

tweaktown.com/news/59152/amd-ryzen-threadripper-16c-32t-epyc-32c-64t/index.html
en.wikipedia.org/wiki/Product_binning
ark.intel.com/Search/FeatureFilter?productType=processors&ECCMemory=true&FamilyText=Intel® Celeron® Processor
ark.intel.com/Search/FeatureFilter?productType=processors&ECCMemory=true&FamilyText=Intel® Pentium® Processor
ark.intel.com/Search/FeatureFilter?productType=processors&ECCMemory=true&FamilyText=Intel® Core™ Processors
supermicro.nl/products/motherboard/Xeon/C600/X9SRL-F.cfm
supermicro.nl/products/motherboard/Xeon/C600/X9SRA.cfm
ark.intel.com/compare/97128,97478
ark.intel.com/compare/97128,97478,92991

If you can get a cheap motherboard then yes.

Teddlyfleece?

Who is this semen demon

Eww. She reminds me of the egg lady from pink flamingos. Something about the face.

Her face is rounder than my balls, and puffier too.

this pic upsets me

There's a blacked man behind her!

Each tit is bigger than that nig's head. She fat as fuck.

that is not a blacked man, that man blacks

>stomach wider than hips
>fatrolls
>judging by face hispanic or a mutt
>fucks niggers
Wouldn't even rate T B H

>fat
round

AYO HOL UP
NO WHYTE BITCHES WELCOME IN WAKANDA

>round
fat

Hah

Obese, is the proper term to describe that.

BLACKED

NO

why would you buy outdated hardware to install updated hardware into?

you can buy a quad core amd

you can also buy a quad core intel

That's a big boobs.

OR you could buy G4600 a.k.a. The King of CPUs

>jewal core
I'd rather buy AMD crap.

Don't.
Xeons are designed for long service life and heavy duty cycles. Gaming computers get upgraded frequently enough that longevity doesn't matter much, and heavy duty cycles aren't a concern because you won't be running a server, so your box will get to sleep from time to time.
I'd invest the savings in a better GPU or maybe ECC RAM.

xeons are absolute trash for gaming man, low clock speeds and slower memory

since when has Xeon ever been used for gaming except by people who bought into the whole Itanium and Xeon gaming meme? hahahaha
Xeon and Itanium are specifically designed for servers. Using them for gaming would be like driving a train pulling a million tons of coal wagons behind it, compared to driving the Orient Express on a day trip from Barcelona to Hong Kong. Don't know much about Itanium chips, but I know enough to know they suck big style for gaming

me again
I just found this
Itanium CPUs are not x86; they are based on the IA-64 architecture, which was Intel's original approach to desktop 64-bit before AMD's x86-64 extension overtook the market.

I don't think it can support 32-bit games, as software support is pretty dismal. Even if it can, it would be through emulation, so I'll say go for something else.

and then verified that it's true: Itaniums cannot handle the 32-bit instructions found in most games

Pretty sure Itanium could run x86 code: early chips had a hardware IA-32 compatibility mode, later replaced by Intel's IA-32 Execution Layer software emulator. Either way it was roughly two and a half times slower than a native x86 CPU at the same clock speed.

>2018
>dual core

ECC RAM is useless in a Windows-based gaming machine. It's only worth having on machines running software and filesystems that actually care about silent memory corruption, the classic example being a ZFS file server, where ECC is recommended so corrupted data never gets written to disk. You're not doing anything important enough on a gaming machine to warrant it (and a Windows gaming box isn't running that kind of filesystem anyway), unless your gaming machine also doubles as a file storage server where data integrity matters.
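If anyone's fuzzy on what ECC actually buys you: the DIMM stores extra check bits so the memory controller can detect and correct a single flipped bit per word on the fly. Here's a rough toy sketch in Python of the same idea using a Hamming(7,4) code; real ECC DIMMs do SECDED over 64-bit words in hardware, so this is purely illustrative, not how any controller is implemented.

# Toy illustration of what ECC memory does: single-bit error correction
# with a Hamming(7,4) code. Real ECC DIMMs do SECDED on 64-bit words in
# the memory controller hardware; this is just the same idea in miniature.

def encode(d):
    # d is a list of 4 data bits; returns a 7-bit codeword
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4              # covers codeword positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4              # covers codeword positions 3, 6, 7
    p4 = d2 ^ d3 ^ d4              # covers codeword positions 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]    # positions 1..7

def correct(c):
    # recompute parity; the syndrome is the 1-based index of a flipped bit
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s4 * 4 + s2 * 2 + s1
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1       # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]        # extract the 4 data bits

word = [1, 0, 1, 1]
cw = encode(word)
cw[4] ^= 1                         # simulate a cosmic-ray bit flip
assert correct(cw) == word         # the flip is detected and corrected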

I have an E5-1620 v1 that runs at 3.2 GHz and turbos to 3.8. However, it's a 130 W TDP part. My i7-4790K runs 4.0 to 4.4 GHz (overclocked to 4.8 on air at 1.275 V) and has an 88 W TDP. High-clocked Xeons are available, but they come with high TDPs to match.

The latest quad-core E5 v4 Xeons that clock high and turbo to 3.8 and 4.0 GHz have 140 W TDPs.

So the problem with Xeons isn't low clock speed, it's how much power they pull for that clock speed. Most "consumer" CPUs are Xeons that didn't meet standards (they don't clock as well as they should, or some parts of the die are broken), so the chips are clocked lower, have those areas disabled, and are then simply rebranded as the consumer or another line.
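Back-of-the-envelope on why the clocks cost power: dynamic power scales roughly with C*V^2*f, and holding a higher clock usually means feeding the chip more voltage too. The voltage and frequency numbers below are made up purely to show the scaling, not measurements of any of the chips mentioned above.

# Rough dynamic power scaling: P ~ C * V^2 * f.
# Illustrative numbers only, not measurements of any specific CPU.

def relative_power(v, f, v0=1.0, f0=3.2):
    # power relative to a baseline part running at v0 volts and f0 GHz
    return (v / v0) ** 2 * (f / f0)

base = relative_power(1.0, 3.2)        # baseline clocks
boosted = relative_power(1.15, 3.8)    # same silicon pushed harder
print(f"pushed chip draws ~{boosted / base:.2f}x the dynamic power")
# ~1.57x: a ~19% clock bump plus the extra voltage needed to hold it
# lands in the same ballpark as the jump from an 88 W to a 130-140 W TDP rating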

If anyone remembers when AMD had yield problems with Phenom, they disabled the faulty core and sold the chips as triple core. Intel does the same thing.

>If anyone remembers when AMD had yield problems with Phenom, they disabled the faulty core and sold the chips as triple core. Intel does the same thing.

This. Intel Xeon 115x socket chips share the same die as i5 and i7. They simply disable features.

AMD's Ryzen chips are apparently Epyc that didn't make the cut.

tweaktown.com/news/59152/amd-ryzen-threadripper-16c-32t-epyc-32c-64t/index.html

Ever since Xeons became a meme, Craigslist/eBay sellers have taken notice, and it's at the point where you're better off buying a used Haswell/Z97 setup of some kind if you're on a budget. The price difference between the two is hardly anything, especially since both are being unloaded en masse because of Ryzen upgrades. An OC'd 4770K has roughly the same single-threaded performance as a 7700K, which is all that matters for games if you pair it with a good GPU. Used Xeons were great back when nobody knew they were alternative i7s, but that ship has sailed.

Gaming doesn't need many cores.

>Most "consumer" cpus are Xeons that didn't meet standards

It's called binning you gormless faggot.

en.wikipedia.org/wiki/Product_binning

>binning

I need a used GPU to replace my GTX 760

what is the best bang for my buck?

Oh dear oh dear.

>outdated
>still does better than the whole FX series line which is already fine for everything including gaming
>g4600 BTFOs FX-8350 in single thread

Fucking gamer kids, leave my board

I bought an old HP DL380 for $100 and put some fucking garbage GPU (560?) in it and could run plenty of older games. And this was in Australia. You could get a server that's just a couple years old in America for the same price.

Put a 1060 or one of those new 9** GTX cards and you could run any modern game.

The PRIMARY downsides are:
>Noise (servers are loud)
>Energy expenditure
>Compatibility (They're designed to be servers, not run gay GPUs in SLI. The xeons usually aren't optimised for modern games either.)
>Clock speed (Xeons tend to be clocked quite a bit lower, and games that need heavy single-threaded performance won't run as well as on an i3/i5/i7)
>Requires ECC RAM, which is much more expensive

If you're smart and know your servers, you could shop around and buy one that lets you strip the motherboard out and set it up in a PC case with quieter fans. Best case is you find a server where everything goes right. Worst case is that nothing is compatible, the firmware sucks dick for PC use and your GPU doesn't work.

>The xeons usually aren't optimised for modern games either.)

>>Compatibility (They're designed to be servers, not run gay GPUs in SLI. The xeons usually aren't optimised for modern games either.)

There is nothing unique about consumer chips that makes them better (or Xeons worse) for gaming. They are all fundamentally built on the same die design; damaged or broken areas are disabled and the chips are then sold as lower-tier or different product-line processors.

>>Clock speed (Xeons tend to be clocked quite a bit lower, and games that need heavy single-threaded performance won't run as well as on an i3/i5/i7)

Again, there are plenty of high-clocked Xeons. They have no problem achieving high clock speeds; the problem is how much power they pull for that clock speed.

>>Requires ECC RAM, which is much more expensive

They do not. They can use both regular and ECC RAM. That is why certain products in the Celeron, Pentium and i3 ranges can be found in high-end home NAS systems supporting up to 32 GB of ECC RAM. ECC is simply a feature that gets disabled in most of Intel's consumer lines. AMD, on the other hand, does not disable ECC support; what limits ECC on AMD consumer platforms is motherboard manufacturer support.
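If you want to check whether ECC is actually active on a box rather than just installed, on Linux the kernel's EDAC subsystem exposes per-memory-controller error counters in sysfs. Rough sketch below, assuming an EDAC driver for your platform is loaded; the paths are the standard EDAC sysfs layout, nothing board-specific.

# Check whether ECC error reporting is active via the Linux EDAC sysfs tree.
# Assumes a loaded EDAC driver; prints corrected/uncorrected error counts.
from pathlib import Path

controllers = sorted(Path("/sys/devices/system/edac/mc").glob("mc[0-9]*"))
if not controllers:
    print("no EDAC memory controllers found (ECC off, or no driver loaded)")
for mc in controllers:
    ce = (mc / "ce_count").read_text().strip()    # corrected errors
    ue = (mc / "ue_count").read_text().strip()    # uncorrected errors
    print(f"{mc.name}: corrected={ce} uncorrected={ue}")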

750 ti

A list of Celerons that support ECC.

ark.intel.com/Search/FeatureFilter?productType=processors&ECCMemory=true&FamilyText=Intel® Celeron® Processor

A list of Pentiums that support ECC.

ark.intel.com/Search/FeatureFilter?productType=processors&ECCMemory=true&FamilyText=Intel® Pentium® Processor

A list of Core processors that support ECC.

ark.intel.com/Search/FeatureFilter?productType=processors&ECCMemory=true&FamilyText=Intel® Core™ Processors


There is no such thing as an "optimised" processor, only a "binned" one. And binned processors are nothing more than faulty dies from a particular line, or perfectly functional dies that have had certain areas disabled to fill a lower-tier line.

>binned processors

en.wikipedia.org/wiki/Product_binning

How can one person be this retarded.

The point isn't that modern Xeons are different from modern i7s. The point is that the Xeons you're buying used for cheap are generations behind and don't support more modern technologies, or even DDR4 RAM. It's not fair to compare $1700 CPUs to $300 CPUs. Nobody is running out to buy new Xeons for personal use.

>higher clock speed
Read previous point

>ECC support
I was talking about servers because if you're buying used Xeons, you're probably better off just buying the entire used server; it's cheaper than buying an individual motherboard and individual Xeons. In that case, ECC RAM is very frequently required on older boards.

>The point is that the Xeons you're buying used for cheap are generations behind and don't support more modern technologies, or even DDR4 RAM.

They're not generations behind. And Xeons are the ones that support the advanced technologies, or are the parts the consumer chips get binned from, like the ECC-capable models in the lists I provided. DDR4 is not a requirement for computing today, unless someone just wants to talk about how they're running the latest and greatest.
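To put a number on the DDR4 point: peak bandwidth is just channels x transfer rate x 8 bytes, and the quad-channel memory controllers on the older E5 platforms close most of the gap on paper. Rough theoretical ceilings only; latency and real-world efficiency are a different story.

# Peak memory bandwidth = channels * MT/s * 8 bytes per transfer.
# Theoretical ceilings only; real workloads and latency differ.

def peak_gbs(channels, mt_per_s):
    return channels * mt_per_s * 8 / 1000      # GB/s

print(f"DDR3-1866, quad channel (e.g. E5 v2 board): ~{peak_gbs(4, 1866):.1f} GB/s")
print(f"DDR4-2400, dual channel (mainstream desktop): ~{peak_gbs(2, 2400):.1f} GB/s")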

> In that case, ECC RAM is very frequently required on older boards.

It is NOT a requirement. Below are two motherboards I own.

supermicro.nl/products/motherboard/Xeon/C600/X9SRL-F.cfm
supermicro.nl/products/motherboard/Xeon/C600/X9SRA.cfm

Both boards support non-ECC RAM. In fact, all of their boards support non-ECC. However, if you need error correction, or want more than 64 GB of RAM, you can only use ECC modules. It is not a fucking requirement.

not a better gpu..

>It's not fair to compare $1700 CPUs to $300 CPUs. Nobody is running out to buy new Xeons for personal use.

You have no clue what you're talking about. Nobody is talking about buying 8, 16 or 32 core chips.

Quad-core Xeons are often only marginally more expensive than their consumer-line counterparts (E3 and Core chips share the same die, just with bits disabled for the Core line, except in certain chips). You're simply paying more for the parts that haven't been disabled, or for a higher-clocked chip.

ark.intel.com/compare/97128,97478

About $39 more expensive for a slightly higher base clock and ECC support. That's it.

Add a slightly older E5 chip in and what do you get?

ark.intel.com/compare/97128,97478,92991

Under $300 MSRP when it was new. A slightly slower clock speed and boost, more cache, a higher TDP, support for a fuckton more memory, and up to 40 PCIe lanes. Meaning, unlike the i7 and E3, instead of running SLI at x8/x8 you get a full x16/x16. So the argument
>They're designed to be servers, not run gay GPUs in SLI.
is fucking redundant.
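For what x8 vs x16 actually means in bandwidth terms: PCIe 3.0 moves roughly 985 MB/s per lane after 128b/130b encoding overhead, so the ceilings are easy to work out. Whether any game actually saturates an x8 link is a separate argument; these are just the link limits.

# Ballpark PCIe 3.0 link ceilings at ~0.985 GB/s per lane.
PER_LANE_GBPS = 0.985      # approximate, after 128b/130b encoding overhead

for lanes in (8, 16):
    print(f"x{lanes}: ~{lanes * PER_LANE_GBPS:.1f} GB/s per card")
# Two cards at x8/x8 (a 16-lane i7/E3) get ~7.9 GB/s each;
# a 40-lane E5 can feed both at x16 for ~15.8 GB/s each.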