AMD BTFOing Intel

>but... but they said AMD will flop

Intel fags on suicide watch

>Implying this is legit
>Implying the 1700X won't be 30% below a 7700K

DELET THIS

Clearly you weren't around during bulldozer.
The Bulldozer cycle started with massive "Intel killer" hype, etc. But then leaks came out showing it was garbage, and the reaction was "b-b-but it's just an engineering sample" and "implying this is legit".

Then bulldozer came out and it was worse than the leaks.

If anyone in the thread truly believes their position, you could always take this to the markets and either have a bullish or bearish position on AMD stock. Personally, I'd just ride the hype wave and get off early before things get too shaky. If it fails, then you got out early, if it succeeds well you may have missed some gains, but at least you are still green.

This is probably final product, it's launching in less than a month

Fanboys were hyping Bulldozer, AMD wasn't.
AMD gave a HotChips presentation on the Bulldozer arch a full year before it was released. They admitted there were some serial performance regressions and talked about how they needed to focus on increasing clocks to regain serial throughput. It was pretty explicit. The only one really blowing smoke up anyone's ass was JF-AMD on various forums, and he was nothing but a marketing schlub.

Not at all comparable to Zen. Zen is a big core comparable to Intel's latest arch in nearly everything but the size of the FPU.
Zen has been tested by independent reviewers already who confirmed its performance. CanardPC has had a published review of a Ryzen ES for weeks. There aren't any major surprises left here.

>i7 6950x @ 3.0GHz

???? it boosts up to 3.5...

AMD PLZ TRY HARDER

Honest question

How important is multicore with current or near-future game engines? I need to decide between a 7700K and a 1700X within the next month, as I have my DAN A4-SFX arriving. What do

There are plenty of games out there that benefit from having 6-8 cores, but they're not pulling ahead by a huge degree. The biggest impact they have is in providing higher minimum frame rates during intense scenes.
A high-clocked 4-core CPU is still decent enough, particularly if it has 8 threads; I wouldn't invest in an i5 right now.
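
As a toy sketch of why that is: a frame can't be shown until every entity update in it finishes, so spreading that work across more workers directly shortens the slowest frames. This is illustrative Python, not real engine code; `simulate_entity` and `frame` are made-up names:

```python
from concurrent.futures import ProcessPoolExecutor
import os
import time

def simulate_entity(seed):
    # Stand-in for one entity's per-frame work (AI, physics, etc.);
    # purely illustrative busywork, not real engine code.
    x = seed
    for _ in range(20_000):
        x = (x * 1103515245 + 12345) % (2**31)
    return x

def frame(entities, workers):
    # One "frame": all entity updates must finish before it can render.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(simulate_entity, entities, chunksize=8))

if __name__ == "__main__":
    entities = list(range(64))
    for w in (1, os.cpu_count()):
        t0 = time.perf_counter()
        frame(entities, w)
        print(f"{w} worker(s): {time.perf_counter() - t0:.3f}s per frame")
```

On a quad core the multi-worker "frame" should finish several times faster than the single-worker one; that headroom is what keeps minimums up during intense scenes.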

Multicore, Intel is moving to hex cores being standard in about 8 months.

You think hex cores would be on 1151? I hope they don't jew out and change the platform

I doubt a new chip with two more cores would have the same pinout as current quad core Skylake and Kaby Lake chips. It'd probably be on a new socket.

How much does 6950X cost?

Why do people doubt Jim shittekter Keller

$1650

$1700 give or take
newegg.com/Product/Product.aspx?Item=N82E16819117643

Holy shit, intel jewery is real.

If you're "smart" you can buy it for $1400

OP's benchmark is great, but the 1700X is OC'd while the 6950X isn't. That's 1GHz of difference.

That being said I'm more interested in 1600x (6c/12t) at sub $300.

I don't think AMD will beat Intel on performance, but on price they will. I mean, finally you can buy a proper 6-core CPU at a great price.

THE JEWS

>That's 1GHz of difference.
If that Ryzen chip really is running at 4GHz, then the difference in clock speed is only 500MHz. The Broadwell-E i7-6950X will run at 3.5GHz on all cores.

I am running Intel core i3-2120 3.30GHz (2 cores 4 threads)
If I upgrade to Ryzen 1700X will I see any difference in compiling times?

You should compare the 6950X to the 3.4GHz Zen; the 6950X also runs at around 3.4-3.5GHz.

So, as far as cards to pair with Ryzen, guess the 480 is the "best" option at the moment?

Not that it matters since I'd mostly be playing at 1080p

Our Ryzen in heaven,
hallowed be AMD,
your IPC be high,
your clocks be high,
on air as on water.

Give us our needed performance.
Forgive us our sins,
as we forgive you for Bulldozer.
Save us in the time of benchmarks,
and deliver us from evil Intel.

Thank you based Jim,
based Lisa, based Raja,
now and for ever.

Ryzen

Good for you AMDrone, accepting that Ryzen belongs in heaven (or perhaps hell) because that shit is going to be dead on arrival.

HAIL RYZEN

>OP's benchmark is great, but the 1700X is OC'd while the 6950X isn't. That's 1GHz of difference.
The 1800X boosts up to 4.0GHz; they OC'd their sample to emulate it

it truly is the central processor used in heaven

>heaven
>x86

unreal engine

Do you think this based Indian would lie to us?

Please AMD, just hurry up. My Sandy Bridge mobo is struggling to stay alive. Only 2 of the RAM slots work now, and at no higher than 1333MHz.

I'm guessing yes. I have one of those, and although they were good for the price at the time, new processors are way more efficient. The thing is, I don't think you need that good a processor for compiling, unless you need every second and have the money to spare.

A 7th gen i5 should be a good upgrade

Do we even have a release date for this shit yet or is it still SOON™

March 2nd is the current scuttlebutt

>i7 6950X @ 3.0GHz
>@ 3.0GHz
Jesus fucking Christ you guys

I liked him better with the mustache, he looked like an Indian film villain.

it doesn't go higher

>Max Turbo Frequency 3.50GHz
And in case you don't know how Intel CPUs work: it stays at 3.5GHz under load unless you intentionally disable turbo.
Not to mention that the 6950X actually boosts the most-used core to 4.0GHz by itself.
WCCFTech is literally FUD: The Website, and this just goes to prove it

AMD's already pushing 8-core as standard in 2 weeks with the absurdly aggressive pricing of their chips.

Watch Dogs 2 will scale with as many threads as it can get.

$300 Ryzens are competing with Intel's $1000 CPUs. What a time to be alive.

Intel is finished and bankrupt.

AMD just caught up to 8-month-old Intel chips.

>btfoing

get the fuck out of here. more cores and more power draw in a market that doesn't need it and won't use it for another 4 years.

The soon trademark belongs to blizzard, please cease and desist.

it's also just as fucking awful as the first Watch Dogs, so who cares

>more cores
Yeah, and this is a good thing.
>more power draw
Nope.
>just caught up to 8 month old intel chips.
Not as if Intel has improved their chips since then.

>yes goyim, keep on buying our overpriced 4C4T garbage tech from 2009 and stalling progress

>thousand dollar cpu
>thousand dollar gpu
>barely even 60 frames per second
THIS
GENERATION
SUCKS

that's at 1080p too
lmfao devs aren't even trying to make their games playable on PC anymore. Zero fucks given for their million-dollar games

Feel free to fuck off to Sup Forums if you want to discuss the merits of the game itself. This is a technology board and the underlying engine is perhaps the most scalable and forward-thinking yet seen in a video game, which makes it interesting.

Keep in mind that this is completely maxed out, beyond even the game's ultra setting (which doesn't turn on several options, including extended distance scaling). I don't know why people think engines designed to push past the limits of current hardware are a bad thing.

People lament the original Crysis as the kind of boundary-pushing PC game that doesn't get made any more. Yet every time a hugely-demanding game comes along, people bitch because their 660 Ti won't run it maxed out at 120fps.

The game scales down extremely well to lesser hardware. I pirated it and can run it a solid 60fps with my three year old 290X with a handful of the more advanced options turned off/down (and no MSAA of course). It still looks great.

I expect the 1700x to be about $100 more expensive than a 7700k. While the 7700k will still be the gaming performance king. Reason to get ryzen will be lots of cores for less money.

We can argue all we want, but graphics cards will still be more important for games.

>Yet every time a hugely-demanding game comes along
because they're not doing anything that some shitty $200 GPU isn't already capable of maxing out.
Crysis was amazing visually and engine-wise; the closest thing to its realism and destruction was Bad Company 2, which looks like an N64 title compared to Crysis. (And there are still some retarded design flaws preventing Crysis from running well on most modern hardware today.)

tl;dr devs are lazy cunts.
you're using a rebranded RX 480 and I'm supposed to be impressed by the fact that it runs the game well?

>rebranded rx 480
Fuck off, retard.

>he thinks his gpu isn't the epitome of GCN
>he doesn't realize nothing but rebrands (aside from HBM) have come out since
the RX 480 is literally the exact same GPU. check their performance if you don't believe me.

>The biggest impact they have is in providing higher minimum frame rates during intense scenes.
Literally all that matters. I hate dipping below 40-50fps in certain games.

>tfw got a used 290x for $220 instead of an rx480 for $380

Vega when

HOW WILL THEY EVER RECOVER?

I'm not even the guy with the RX 480 you were responding to, just tired of retards like you equating Hawaii with Polaris.

They are not "literally the exact same gpu." McFucking kill yourself ESL tech illiterate.

>those FX CPUs
Fuckin branchless integer monsters I swear

what RX 480 is $380?

the ones I've seen have all been around $220

Probably talking syrupnigger or kanganigger bux.

This is in Canada with maple tax.

In burgerbux the 290x was $168. Good deal?

>Zen is a big core
For you.

enjoy your housefire.

>kanganigger
Just call them Austrians

niggeroo sounds better too

Will do

>hurr durr it has a different code name!
>hurr durr it shifted nm!!!
>hurr durrr the architecture is the same but they're different i swears!!!
>hurr durr their performance is identical but it's impossible that they're the same!!!
>anyone who says it's a rebrand must be retarded, just look at all those similarities, no way is it the same!

they'll rear their heads again when AMD decides SMT is not enough
The cores themselves were pretty good; it's just that the branch predictor would often shit the bed, and when it did, the cache was too slow and too high-latency to re-feed the cores. Even long after the rest of the core was redesigned, the cache was still too slow.

Yes
Yes, 28nm to 14nm was a good-sized jump.
Well yeah, at the core it's still centered around 16-wide SIMD clusters in groups of four with a shared cache and scheduler. But enough of the render output processors, geometry pipeline, and memory controllers changed that they can be classified as different architectures.
Their performance is not identical: the RX 480 is arguably bandwidth-limited by its 256-bit bus. Even with the DCC implementation added in GCN 1.2, that's enough to hold its performance down. In cases where it isn't bandwidth-limited, it outperforms a 390X.
Yeah, everyone who says it's a rebrand is in fact retarded.

Yeah, as long as FX wasn't given branches it was pretty much fine. People goddamn love their if-elses though.

Wonder if we'll ever see a return of Itanium's branch "prediction": it'd go down both paths simultaneously and then drop one when the condition resolved.
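
As a hand-wavy sketch of the idea (Itanium did this per-instruction in hardware via predication; the function names here are made up):

```python
def branchy_select(cond, a, b):
    # Normal code: a real CPU has to predict which path this takes,
    # and pays a pipeline flush when it guesses wrong.
    if cond:
        return a + 1
    return b * 2

def predicated_select(cond, a, b):
    # Itanium-style: compute BOTH results up front, then keep one.
    if_taken = a + 1
    if_not_taken = b * 2
    # Select by index (data), not by control flow -- nothing to mispredict.
    return (if_not_taken, if_taken)[bool(cond)]

for cond in (True, False):
    assert branchy_select(cond, 3, 4) == predicated_select(cond, 3, 4)
```

The select on the last line is driven by data instead of a branch, so there's nothing for a predictor to get wrong; the cost is that both sides always execute.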

>Their performance is not identical
>same min/max/avg frames per second in real-world tests
here are two eggs, they contain the exact same amount of nutrients and total weight, but i'll tell you they taste different because i painted the shells differently.
no way is it a rebrand user, it just tastes the same and leaves you feeling filled exactly as the other one does.
no way in hell are they the same though. nope nope nope. rebranding doesn't happen, especially not for my super cool technologically advanced eggs.
never in the pc industry has anyone ever rebranded anything.
ever.

frametimes, frame drops, tessellation handling, ROP throughput, and pixel and shader engine throughput are all different on the 480

Please try to do 32x tessellation on anything older than it and tell me how it goes for you. Hell, even 16x.

Things having the performance of previous things while being completely different is common in computers mate. The 7750 and 5770 had nearly identical performance in games, but would you say the 7750 was a rebrand despite being a completely different architecture that used much less power on a smaller process node?

But then there is the 6770, that was quite literally a rebrand of the 5770, same core, same codename, same frequency, same performance. You could crossfire a 5770 and 6770 together even. That's a rebrand.

You are correct, they would more likely use AMD64, A.K.A. x86_64, not an outdated 32-bit x86...

>because they're not doing anything that some shitty $200 gpu isn't capable of maxing out already.
damn this is some next gen retardation

It's not just game-engine scaling, it's system resources. If you only have a dual-core chip, anything running in the background (even background OS tasks, because lol Windows) eats into resources you'd want for da vidya.

It's why, with the huge amount of cache and threads MOAR COARS offers, you can do a lot more without your CPU dying a painful death the second you try to play anything intensive.

>$300 Ryzen's are competing with Intel's $1000 CPUs. What a time to be alive.

This is what people don't get. The 1700X is barely behind a $1,500 chip while costing around $350. Intel is fucked in several ways.

How's this series, by the way?

>this shit again

Big companies are all using multithread. Indie devs don't make demanding enough games for you to care.

>Big companies are all using multithread.
Only if they have competent coders, like DICE, CDPR or id.

>shitty dual channel
>can't handle 2666 ddr4

>faster than Intel

How?

I'll wait for the real benches.

You don't need quad channel unless you want workstation and AMD will have workstation CPUs too.

Yeah, the 16-core Zens are gonna be good shit. Don't even care about the 32-core, that'll just be overkill overpriced crap.

Wonder if they'll try a dual socket skullmeme equivalent for consumers.

>Implying Intel fears AMD and not Apple and Samsung

Accept your inevitable new processor overlords Sup Forums.

Apple and Samsung have already fucked Intel out of the mobile market.

Qualcomm, Applel and Samsung already killed any chance of Intel getting into the mobile market.

>Yeah the 16 core Zen's are gonna be good shit.
>quad channel memory and lots of highly clocked cores for 1/3rd of Intel's price
Sounds like a wet dream but please be real.

Only the server chips will be quad-channel; desktop Ryzen is dual-channel. Then again, if quad-channel matters to you, you're buying a Xeon anyway (ECC, yo), so meh.

Naples is 8 channel

Only for the top 32C/64T part.
>Then again if quad channel matters to you you're buying a xeon anyway (ECC yo) so meh.
AMD chips usually support ECC.

>AMD chips usually support ECC.

Bulldozer was a long time ago and nothing newer (iirc) has it.

Not the guy you replied to, but try compiling something big (Android, Chromium, LibreOffice, etc.) with an i3-2120.
Chromium alone will take a whole day.
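
A sketch of why core count maps so directly onto compile times: translation units are independent, so a build runner can fan them out exactly like `make -j$(nproc)` does. This is illustrative Python; `compile_unit` is a stand-in, not a real compiler invocation:

```python
from concurrent.futures import ProcessPoolExecutor
import os

def compile_unit(source):
    # Stand-in for compiling one translation unit; a real runner would
    # do something like subprocess.run(["cc", "-c", source]) here.
    return source.rsplit(".", 1)[0] + ".o"

def build(sources, jobs=None):
    # Like `make -j$(nproc)`: each unit is independent, so up to
    # `jobs` of them compile at once -- build time scales with cores.
    with ProcessPoolExecutor(max_workers=jobs or os.cpu_count()) as pool:
        return list(pool.map(compile_unit, sources))

if __name__ == "__main__":
    print(build(["main.c", "util.c", "net.c"]))  # ['main.o', 'util.o', 'net.o']
```

With thousands of units in something like Chromium, going from 2C/4T to 8C/16T cuts wall-clock build time dramatically; only the final link stays mostly serial.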

Aight niggas I got a question.

My old i7 920 lets me enjoy games with my RX 480, but I'm crazy CPU-bottlenecked.

Should I bother throwing in a Xeon X5650 and overclocking it? My 920 is a space heater running at 3.6, or 3.8 when it turbos. I FINALLY turned HT on because it helps in shit like Hitman's DX12 mode.

So: Xeon X5650 and stay with this motherboard that's missing SATA3, though I don't really need it?

Or wait for Zen. Anyone able to tell me if the X5650 will give me enough of a boost in single-core performance once I've got it overclocked? It should go to 4.0 easily since it's 32nm vs the 45nm furnace I'm running now.

I'm in Florida so the ambient temp is already high; it'll idle at like 55 degrees. I need to dial in my voltages though.

Just fucking wait for Ryzen; it also has a new chipset, and mobos are fairly affordable.

they OC'd the 3.4GHz base to 4.0GHz to emulate a 4.0GHz turbo? how does that make any sense?

He probably bet on Ryzen not going over 3.0GHz at launch.
The moustache is now hanging on the wall in Lisa Su's office, above the fireplace.

>fps is the only metric I understand
You are such a dumbass.

We are going to build a wall and make Sup Forums pay for it.

Sup Forums will basically implode when Ryzen drops, and the Sup Forums spillover will be the only thing left standing in the crater, because stupidity is sometimes the best defence.