IT WASN'T SUPPOSED TO BE LIKE THIS!!!

IT WASN'T SUPPOSED TO BE LIKE THIS!!!

What are you even trying to say retard?

not him but it says great for high performance gaming

DELET THSISS

It pretty much is.

>300
Did it get another price drop?
Good god that's good value

If you enjoy stutter, go Intel

...

...

It hasn't even started yet. Intel is about to get BTFO bad; they'd better start jewing now and making exclusive deals with manufacturers.

Let's see if we can purge the autism for an actual, functional discussion.

So the question is: does this particular processor deliver high-performance gaming? I believe for $50 more I can get the i7-7700K: 4.2 GHz straight out of the box. The best this Ryzen has is 8 cores for the price. However, sticking to gaming, do any games now or in the foreseeable future utilize multi-core/parallelization, never mind needing eight cores?

The fucking Threadripper releases say it's great for gaming too.

They're just calling all their consumer CPUs gaming CPUs.

Don't modern gaymes spawn 16+ threads?

Moar cores isn't such a bad idea

For gaming alone, I would get the i7-7700K, but I like to run more than just muh vidya games, like recording software, and despite how silly this sounds, I do a lot of video encoding. It would be nice to do both at the same time without either really suffering much of a hit.

If you check any credible benchmark, Ryzen is as good as the 7700K in frame times.
Intel is better at FPS per core, though.
What this means is you can get a smooth 60 fps with Ryzen but 55-65 with the 7700K,
so it is a trade.
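The frame-time vs. average-FPS distinction in that post can be sketched with toy numbers (all values below are invented for illustration, not benchmark data):

```python
# Two hypothetical runs with the same average FPS but very different
# worst-case frame times -- the second one is what "stutter" looks like.
steady = [16.7] * 8                                       # consistent ~60 fps pacing
spiky = [12.0, 25.0, 12.0, 25.0, 12.0, 25.0, 12.0, 10.6]  # same total time, uneven pacing

def avg_fps(frame_times_ms):
    """Average FPS over a run: frame count divided by total seconds."""
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame(frame_times_ms):
    """Longest single frame in ms -- what you feel as a hitch."""
    return max(frame_times_ms)

print(round(avg_fps(steady), 1), worst_frame(steady))  # 59.9 16.7
print(round(avg_fps(spiky), 1), worst_frame(spiky))    # 59.9 25.0
```

Both runs report ~60 fps on an average-FPS counter, but only the first one feels smooth, which is why frame-time charts matter more than the FPS number.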

And yes, games already use 4 cores, so using 6-8 seems legit as multi-core becomes more common. But a 4-core will still last you 1-2 years just fine.

I use 3 monitors and run many things at once, so an 8+ core makes sense for me. It still does not for most people [6 does tho].

modern games still use 1 core

*posts the hamster image*

...

are you a fucking retard

I always use more than 4 cores on Doom; usage varies, but it gets up to 12-16 threads.

7/10 troll post
For getting replies

>8-core CPU with 3.7 max frequency
>on stock (included) cooler
>gaming performance comparable, not better, to Intel offering
>in some ways gaming performance is better (steadier frames; higher minimum FPS)
>open platform relative to Intel (no locked features)
>$300
>only three hundred (300) fucking dollars
Literally another holocaust.

If literally all you're doing is gaming, then the 7700K has higher framerates, but only marginally. Additionally, Ryzen has much more consistent frames and a higher minimum framerate. This means that with Intel you might hit a higher max by a few frames, but you'll also drop to lower framerates than Ryzen. As far as whether games utilize multi-core: it depends. A few do, most don't need it. However, the future is definitely multi-core. The Xbone and PS4 both use octa-core AMD CPUs, and they, unfortunately, drive game development.

lmao people have been saying that since bulldozer and look where that went

Different person, but I'd rather have more consistent, smoother gameplay at 60 fps instead of worrying about FPS memes. 60 is probably the smoothest anyone can see, and I'd rather keep the frame-rate dips close to 60 than worry about having that one moment where I hit 200 fps, when not only can my eye not tell the difference, but the monitor (most likely) can't display it.

Devs are lazy. I'm sure some of those 8 cores are dedicated to the shitty OS and maybe 6 max for gaems.

Intel must really fear the new 16 core JEWRIPPER processor to be shilling harder than normal.

if that was even remotely true then you would have seen the results years ago
but it isn't, say it as much as you want but that won't make it true

The human eye can only perceive 30 frames per second. Why do you think movies aren't filmed at 60 fps?

Saying what? Bulldozer was shit. No amount of threading could have made it good. Shit multiplied by any number is just more shit. Even AMD knew it was shit. Games are utilizing more threads, and Ryzen actually has a decent architecture that can compete with Intel.

Will an 1800X beat the 7700K in games made five years from now? Maybe, maybe not. The point is, Ryzen can compete with Intel now, which is the major difference from Bulldozer, and betting on more multi-core games in the future is a better bet than betting against them.

Too many kids ordering prebuilts from iBuyPower and walking around like know-it-alls because "I built my system".

Also, those same people have no fucking sense or ability to think logically and will spend 2 grand on a system to play a game at a frame rate they can't see, or their monitor can't display, when an $800 system can do it on ultra and hold a super consistent 60 fps.

Stop memeing and start thinking.

Literally because it saves the movie industry money, since they don't have to upgrade their hardware.

Blind tests have proven the "can only see 30 fps" thing is bullshit, btw. Do some research, and I mean go watch or participate in true blind tests, not some meme-driven "research" page.

your retarded, don't reply to me anymore

Funny how all the snowflake manchildren here do video encoding, which rationalizes their purchase.

Before Zen, g never talked about it

It is a measure of futureproof-ness. Graphics and resolutions are only increasing in detail. The thinking goes like the following example: hardware setup #1 can handle 2017's triple-A game at 1080p at 200 FPS and 2020's triple-A game at 4K at 80 FPS. Setup #2 can handle 2017's triple-A game at 1080p at 150 FPS but 2020's triple-A game at 4K at only 40 FPS. Hence, setup #2 is edging toward (if not already) obsolete and in need of upgrading.

Toady pls go.

>Before Zen, g never talked about it
Maybe because Intel was choking and jewing the entire x86 industry to death with their "anything more than 4 cores = a kidney" monopoly bullshit.

So glad I held out with my Phenom build to get an 1800X. I have been an AMD fanboy since my first build. I wish I remembered what it was, but I think it was an Athlon Thunderbird? You could enable overclocking by drawing a line with a graphite pencil between two points on the top of the CPU.

AMD is dropping prices because of 1. insane yields on their 8-core dies, and 2. Threadripper coming soon, which means the prices on Threadripper might be even lower than anticipated.

guru3d.com/news-story/amd-ryzen-14nm-wafer-yields-pass-80-threadripper-cpus-on-track.html

True, the logic follows, but the processors being compared are so close that this argument doesn't apply so well. The differences in performance are negligible, but at the same time one offers more consistent gameplay for current-gen games.

AMD brought the fire to the masses. Intel tried to keep it to themselves. You can see the results.

>the 7700k has higher framerates, but only marginally.
And only if you're not GPU bottlenecked. If you are, they're basically the same. 1080p 144hz w/ 1080 or above = not GPU bottlenecked. Anything else = GPU bottlenecked. (Or monitor refresh rate bottlenecked, but whatever)
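That bottleneck reasoning boils down to taking a minimum over the stages of the pipeline. A toy sketch, with entirely made-up throughput figures:

```python
def effective_fps(cpu_fps, gpu_fps, refresh_hz):
    # The frame rate you actually see is capped by the slowest stage:
    # how fast the CPU can prepare frames, the GPU can render them,
    # and the monitor can display them.
    return min(cpu_fps, gpu_fps, refresh_hz)

# Hypothetical numbers: CPU feeds 170 fps, GPU renders 90, on a 144 Hz monitor.
print(effective_fps(170, 90, 144))   # 90 -> GPU-bound; a faster CPU gains nothing
print(effective_fps(170, 200, 144))  # 144 -> refresh-rate bound
```

Once the GPU (or the monitor) is the minimum, CPU differences between the 7700K and the 1800X stop showing up in the displayed frame rate.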

I wish AMD could have decent single-core performance. I'm still stuck with Intel because of that.

What do you even use single-core performance for?

>Bottleneck, bottleneck, bottleneck...

*Cringe*

Emulators, especially PC-platform emulators such as PCem, Neko Project and so on. The bottleneck is that the processor is emulated on a single thread, so you need very high single core performance.

Unfortunately Ryzen just about reached the level of my 6 year old i5.

AMD does have decent performance. Decent does not mean the best you sack of shit.

See the image above. My five year old CPU is still better. Not what I'd call decent.

REEEEEE

I can actually understand the low frame rates for movies. The Hobbit was 80 fps or some shit, and it looked terrible, like some amateur zombie film or a soap opera.

It's not like it's going to go much past those stock speeds without liquid nitrogen anyways.
Why would Intel be afraid of a clocklet that can't boot with fast RAM?

great ≠ the best
Will you retards ever understand that?

>pissmark
Stop using this trash.

>300
Fuck you Americans

80 fps? Where's that info? Also, go watch any 60fps enabled YouTube video and then turn 60fps off and watch it again. It's like night and day.

>AMD
Stop using this trash.

If that shit was even remotely accurate, the 1800X would have 20k on multicore score. That is just an embarrassingly bad benchmark.

Kill yourself.

...

>monitor refresh rate bottlenecked

>It's spam

It would be nice if everyone took each other's advice on this, the boards would have far less idiot children spamming them.

>Darwinism supporter

...

I lied, it was 48 fps. It made it much easier to see the fakeness of costumes and hurt the look of practical effects. Games are a totally different story.

You realize that 487 × 2.8 ≈ 1364 and 7148 × 2.8 ≈ 20014? You pretty much just proved my point by linking to a benchmark showing how fucking stupid pissmark is.
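The scaling arithmetic in that post is easy to check. Taking the quoted figures at face value (the 487 and 7148 scores and the 2.8× multiplier all come from the thread's linked benchmark page, and are treated as given here, not verified):

```python
# Figures as quoted in the thread -- assumptions, not verified data.
single_score = 487   # quoted per-core-ish score
multi_score = 7148   # quoted multicore score
scale = 2.8          # the multiplier the poster applies

print(round(single_score * scale))  # 1364
print(round(multi_score * scale))   # 20014
```

The point being made: if the benchmark's own per-core numbers scaled consistently, the multicore score would land around 20k, not 7k.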

>It made it much easier to see the fakeness of costumes and hurt the look of practical effects
This. I don't think they were quite prepared for how much more detail you can see. Everything has to be super on point, because you can't hide everything with blur anymore.

Point being movies spread the false info about the 30 fps thing to the mouth breathing masses so they could cheap out on equipment.

Don't forget that FPU performance matters a shit ton for emulating as well, which is where Intel absolutely btfos the competition at 4 cores or less.
cpu.userbenchmark.com/Compare/Intel-Core-i7-7700K-vs-AMD-Ryzen-7-1800X/3647vs3916
Moar Coarz btfo

Well, I think people aren't stopping to consider that maybe The Hobbit was a rushed, bad movie without a good implementation of over-24fps cinematography, which is sad, because now that the bad apple has spoiled the bunch, nobody is going to want to buy tickets for another 48fps movie.

I'm not interested in multicore as I've stated. The single thread result is consistent.

>cpu.userbenchmark.com/Compare/Intel-Core-i7-7700K-vs-AMD-Ryzen-7-1800X/3647vs3916
Moar Coarz btfo

>300 shekel cpu defeating a 460 shekel cpu

Pissmark is still trash.

Sounds like emulators need to step up instead of expecting cpu devs to keep their crutch from falling out.

Too bad for you the real competitor to the R7 series is the i7-6900K. Which costs twice as much. imgur.com/a/wMm1C

Enjoy not overclocking

You can't spread the emulation of a single processor across multiple cores. That generally isn't a problem when you're dealing with consoles because they generally have a bunch of low-power processors, but emulating something as powerful as a Pentium 300MMX is pretty demanding.
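The reason given in that post can be shown with a toy interpreter loop (a hypothetical accumulator machine, nothing like a real Pentium core): every emulated instruction reads the state the previous one wrote, so the steps cannot be handed out to different host cores.

```python
def run(program):
    """Toy single-CPU interpreter: one accumulator, one program counter."""
    acc, pc = 0, 0
    while pc < len(program):
        op, arg = program[pc]
        if op == "ADD":
            acc += arg               # depends on acc from the previous step
        elif op == "MUL":
            acc *= arg
        elif op == "JNZ" and acc != 0:
            pc = arg                 # control flow depends on current state
            continue
        pc += 1
    return acc

# Each step must see the result of the one before it -- inherently serial.
print(run([("ADD", 3), ("MUL", 4), ("ADD", 1)]))  # 13
```

Because the chain of data and control dependencies never breaks, the only way to make this loop faster is faster single-core execution, which is exactly the constraint single-system emulators hit.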

The best Ryzen parts haven't even landed.

Mobile Ryzen is going to be a slaughter. Intel better start bribing manufacturers right now before it's too late.

Emulators are one of the only things where a lot of single-core performance helps. But it's not a dealbreaker unless you're doing Cemu or some other alpha-quality crap. Those shit emulators the other guy mentioned don't need insanely high single-core to run well.

Not all emulators are single-threaded shit; RPCS3 benefits hugely from multiple cores.

Can't hear you over my student loans and crumbling infrastructure, europoor.

Yeah, once that one gets going, Ryzen is going to be the best CPU for that emulator by far.

The real competitor to the 1800X is the i7 and i5, which cost less and get thrown in way more prebuilts.
Penny-pinching server fags that only care about their TDP/core ratio plz go.

Basically this. Imagine a fast quad-core overclockable laptop that you don't have to pay an arm and a leg for. I miss cranking my gen-1 laptop APUs up to 4.0 GHz with AMD OverDrive lol. Hopefully availability isn't garbage, but this is AMD we're talking about.

Not gonna lie, I might go AMD if Threadripper costs $600 or less.

Make it happen amd.

>$1000 CPU vs $300-$480 CPU
>Penny pinching
It sure as hell made a lot of difference to a lot of people.

You might get a 10 or 12 core for that much.

The 1800X is getting scores comparable to the 6900K; who are you fooling?

...

>300 dollar cpu vs slower more cores 300 dollar cpu
It only makes a difference to server fags

Neither of those are server CPUs.

yeah i read somewhere it would be 330, not bad for 16 fucking threads

It's got benchmarks to prove it, but keep swallowing that Intel cock, you shill.

If Ryzen and i7-6900Ks aren't server CPUs, then their TDP/core ratio matters even less.

Only if you cherry-pick with a binned 4.1 GHz CPU that 95% of overclockers won't reach.
Enjoy your gimped RAM support.

I'm on X99 currently, and a year from now I can definitely get an overclockable 8/10-core Xeon for less than $300, and it would hit at least 4.5 GHz, which I doubt Threadripper could do.

The (controversial) consensus is that AMD finally caught up with Intel in IPC, but the cores are stuck at around 4 GHz. I'm willing to give up 500 MHz and compromise on gaming for a few extra cores. Not shitting on anyone here, just planning out the logical path.

>Ignores benchmarks
>Makes baseless claims

GG

>ignores benchmarks
>makes baseless claims
>n-no you!
GG AMD

Threadripper's going to be out this summer. Waiting a year and buying a used server CPU in the hopes you can OC it is pretty risky IMO. Your call.

No, fuck you. Action movies hurt my eyes with their shitty framerates because the whole screen blurs if the camera moves. I get it for artsy movies that want a cinema feel, like how adding grain to a digital film can give it character. However, for action movies pushing the limits of computer graphics, there's no excuse for fucking 24 FPS.

The ironic thing is you probably watched The Hobbit at 24 FPS (it was only played at the higher framerate for select viewings) and think it was shit because you actually paid attention to it.

I got mine for like $305 with free two-day on Amazon and no tax 'cause third party. Overclocked to 3.8 GHz, this thing is an absolute beast for the money.

Basically an 1800x now for 300 bucks.

>JEWRIPPER

I want to fucking die!
ebay.com.au/itm/AMD-RYZEN-7-1800X-/192148576158?hash=item2cbcf2979e:m:m8EOJNPcat5M87krm_IoSqg

All consumer chips have a Xeon variant; it's the same shit.

I'm on x99 so it's not like I'm exactly starved for performance anyways. The hardware whore in me wants a massive socket with a massive chunk of waterblock on the eatx zenith.

AMD really hit something here for the richfags, I'll tell you that.

>Claims cherry pick
>Immediately cherry picks

Kek

>all benchmarks done with fucking 1080ti or titan xp
maybe i wanna pair my $300 cpu with a $400 gpu huh how about that you fucking cunts

Imagine a supremacy evo for the threadripper... at least 6lbs!!!!! Fucccck