Jesus fucking christ, it looks like we're actually getting that 16 core Zen with 3.6/4.0 clocks!
AM4?
No fucking way, AMD doesn't have nearly enough pins for this, it's some LGA 3600 or something, enormous fucking package.
>16 coarz at 3.6 base
Have I been reincarnated? 16 core Xeons top out at 3.0 at best and cost like $5000
They also have a fuck ton of cache, pcie lanes, and memory bandwidth.
The Xeons? Some 50 lanes, quad channel memory, and around 40MB of cache.
I think that's roughly on par with these 16 core AMDs
Guess you never heard about the E5-2679v4, which is 20 cores @ 3.2GHz. 200W TDP though.
It's getting hot in here.
Is that all core turbo for x86 or SIMD? Because there's no way its base clock is that high even at housefire TDPs
1800X: 95W for 8 cores stock, ~200W at 4GHz OC
-> 16 cores and double the above
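Napkin math, taking those figures at face value (two 1800X-class dies is an assumption here, not a confirmed spec):
[code]
# Doubling the thread's claimed 1800X numbers for a hypothetical
# two-die 16-core part. Rumor math, not a datasheet.
cores, tdp_stock_w, draw_4ghz_w = 8, 95, 200

print(2 * cores, "cores")              # 16
print(2 * tdp_stock_w, "W stock-ish")  # 190, close to the rumored 200W TDP
print(2 * draw_4ghz_w, "W at 4 GHz")   # 400, why all-core 4 GHz is unlikely
[/code]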
And it's useful for what, exactly?
Highly parallelized workflows.
AMD is undercutting Intel in the market the latter is dominating.
This isn't undercutting. This is obliterating.
Base clock is 2.5GHz.
sure for $1000+
servers
video encoding
that's about it
>not games
gtfo
This screenshot is fake
ryzen does poorly even in meme games like aots
So did every genuinely new arch ever. It's a natural cycle of things.
MAKE MORE CORES!
>Moar bipelines for le ebin processor
AyyMD in 2018
it's a fact of nature that you can't use 32 logical cores to speed up just any task. it's like having 32 chefs to make you a soup. they can't make you 1 soup 32 times faster than 1 chef.
Longer bibeline is Intel trademark tech.
>food analogies
Oh.
it's the same in software just that an analogy makes it easier for retarded underage amerilards to understand.
But what if you want to make 32 soups?
Then you're still better off with 5 chefs that work together.
AMD BTFO
INTEL REIGNS FOREVER
Yes-yes, Mr. Krzanich, but what about 10nm yields?
...
then yes it could be better to have more chefs. but in a video game you're only entertaining one user (the player) whereas with a server you're dealing with many simultaneous users so it's easier to put more cores to use. in a video game you might have more characters and more physics objects etc but currently it's not feasible to put 32 logical cores to good use in a video game especially since a lot of multi-threaded tasks can be done on modern gpus anyway.
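The chef thing is just Amdahl's law. Quick sketch with a made-up 60% parallel fraction, purely illustrative:
[code]
# Amdahl's law: overall speedup is capped by the serial fraction of the work.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for n in (1, 4, 8, 16, 32):
    print(f"{n:2d} chefs: {amdahl_speedup(0.6, n):.2f}x")
# 32 chefs on a 60%-parallel soup: ~2.39x, nowhere near 32x.
[/code]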
That pic is outdated. Someone post the 'release skymeme again' one.
this looks like a horseshit photoshop
or actually, you could do this poorly in ms paint too
>is outdated
no, it's not. AMD is still not a viable option for a day-to-day CPU, except maybe budget builds and as a selling scam (muh coraz makes a good ad for uninformed users).
Brian, stop shitposting and start hiring better engineers instead of donating to Feminist Frequency.
That's what your mom said about my package
Damn poojeets are shilling again.
What does it take for a person to get this delusional?
...
>windows 10
Well that is scary as all hell. AMD has made 200+ watt TDP chips before though.
>using any form of windows for server grade chips
Git gud.
>FX 9XXX flashbacks
nope.
128 pcie lanes per cpu, 8 channels of ram.
64 of the pcie lanes per cpu are for infinity fabric
That's the 32-core Naples. The 16c one will have roughly half of that.
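Which nets out like this in a 2P box, if those in-thread numbers are right (rumor figures, not a datasheet):
[code]
# Usable PCIe lanes per the figures quoted above (rumor, not official).
lanes_per_cpu = 128
fabric_per_cpu = 64   # spent on socket-to-socket Infinity Fabric in 2P

print(lanes_per_cpu)                          # 1P: 128 usable
print(2 * (lanes_per_cpu - fabric_per_cpu))   # 2P: still 128 usable
# And the 16c part at "roughly half": ~64 lanes, 4 memory channels.
[/code]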
200W for 16 cores / 32 threads is very efficient power usage.
It's a 16c/32t chip at VERY high clocks.
...
>5ghz kabylake flashbacks
It's useful for whatever Intel's Xeons are useful for: workstation, server, productivity.
And? It's 16 fucking cores with SMT.
So it's pretty much 2x 1800X. Hopefully there is a 130W 16 core part equivalent to the 1700.
No doubt, but it's gonna be clocked at like 3.0
I was just pointing out how ridiculous power usage gets on Intel's 4c8t
DELID
since when is 130W for total system draw during load a lot
It is when the 8 core competition runs lower.
Here's the chip only, measured right off the motherboard.
Notice the 57 watt delta between 4.4ghz stock clock and 4.8ghz?
Yeah that's nuts.
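It's not mysterious though: dynamic power goes roughly as f·V², and the last few hundred MHz need a fat voltage bump. Toy numbers below (the voltages are guesses, just to show the shape of the curve):
[code]
# Dynamic power ~ C * V^2 * f. Voltages are illustrative, not measured.
def rel_power(f_ghz, volts, f0=4.4, v0=1.20):
    return (f_ghz / f0) * (volts / v0) ** 2

print(rel_power(4.8, 1.35))  # ~1.38x power for ~9% more clock
# On a ~150 W baseline that's ~57 W extra, i.e. the delta above.
[/code]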
the difference is negligible
where's the 57 watt delta in your chart?
Far from negligible when you put 8 Intel cores in comparison.
7W at idle and 12W at a medium workload is hardly what most people care about in a workstation. the cost of electricity is peanuts compared to the salary of the worker
The 6900K consistently uses 30%+ more power than an 1800X here
even in Prime95, which is the most unrealistic torture test, the difference is 35W at stock clocks or 23W with both at 3.8GHz, which isn't a huge deal
But I'm sure it was a big deal when the 980 released with 780 Ti performance while using 30% less power. It was great then.
Power usage doesn't matter, just another excuse to add to the pile.
But what if the game needs to get 32 things done to provide that entertainment?
7W at idle and 12W at load is literally negligible
Or a failed Naples cpu
all ryzens are failed naples cpus
Would be great if this was Naples with 2 cores per CCX disabled. Imagine the cache and lanes on that thing, not to mention an 8-channel memory controller
But I doubt it, because there's a 12 core part coming from this 16 core platform as well.
>16 core
Yeah cpu noob here, why not 64 cores?
cost
difficulty of manufacturing
power usage
heat/cooling issues
most people don't need 64 cores
is this real?
Could be real
Not enough room for that many cores yet.
>AyyMD
>put in more cores
yep, everything seems to be in order here.
>But what about .. MUH VIDYA?!
/v/irgin purge when?
AMD's highest core count CPU has been 16 cores for the last 4 years.
Intel's coming out with 28 core ones and its current top is 24 core.
at most they would need twice the pins of the PGA 1331 AM4 socket.
More likely, it'll be around 2500 pins LGA, since power/ground pins can be shared rather than doubled.
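The pin estimate is just this (speculation, back-solving the shared-pin count to land on ~2500):
[code]
# Socket pin speculation following the post's own logic.
am4_pins = 1331
naive = 2 * am4_pins   # 2662 if literally every pin were duplicated
shared = 162           # hypothetical power/ground pins shared, not doubled
print(naive, naive - shared)   # 2662, 2500
[/code]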
>200W TDP
Would cost way too much to manufacture, unless it was several dies on one CPU, but then it wouldn't be much cheaper than having 2 CPUs (and wouldn't have any advantages either).
Consumes too much power = heat. Imagine getting 8 times the heat of 1 Ryzen CPU away from such a small package. That'd be like 800W; would need some unprecedented cooling.
>3.2GHz
BOOST.
>take average 1800X binned die
>peak power usage is 110w under 100% AVX load
>typical usage remains under 95w in everything else
>put 2 of them together on an MCM
>still pretty impressive overall
>now have double the memory bandwidth, PCI-E lanes, and DIMMs per socket
If they use dies with better clock binning than the consumer Ryzen parts, they'll be pretty incredible workstations. Price it at $1500 or less and Intel can't compete.
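Spelled out, the MCM doubling from that greentext looks like this (the wattages are the poster's claims; 2 memory channels / 32 lanes per die is the usual assumption for a Zen die, not confirmed here):
[code]
# Two 1800X-class dies on one MCM: everything per-socket doubles.
die = {"cores": 8, "peak_w": 110, "typical_w": 95,
       "mem_channels": 2, "pcie_lanes": 32}
mcm = {k: 2 * v for k, v in die.items()}
print(mcm)  # 16 cores, ~220/190 W, 4 channels, 64 lanes
[/code]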
>40MB L3 cache
TempleOS in cache?
>16 core
>4.0 clock
LITERALLY WHAT FOR???
Performance, you nerd.
It already happened, didn't it?
much more likely that it's Naples with 2 defective dies.
Different package and socket. It's not Naples.
Intel stopped being a viable option for anything other than paying an extra 50% for 10% higher performance the moment Ryzen was released.
lol fag maybe for workstations but not for gaming and general usage
To burn your house down
>not for gaming
Gaming at above 120Hz, and that's just currently.
>general usage
Nooope.
Fake.
Can't wait to install gentoo on it