Guys what do we do now Intelbros

This is another shoah

fake

I know, we should post 1366x768p, that's more important than 4k, we're not poorfags I swear.

But do remember to buy Kabylake for that 4k netflix, the gaming performance is a-a-a-almost as g-g-good...

Lower resolutions are more taxing on the CPU
AMD game, at high resolution they get the GPU bottleneck they wanted, no surprise

...

...

Not fake, it actually does really well in tomb raider.

You can stop making the same retard shit threads and find something better to do with your life, or just fucking kill yourself.

Yeah, I know, but Intel is worse in 4k gaming than AMD so that can't be helped, also uses more power and has half as many cores.

Since I got a 4k monitor I usually downscale it to 768p to experience games to the max.


Fucking poorfags

Can Intel catch a break?


>slower in highres gaming
>intel users are poorfags with a 640x480 monitors
>8 core uses 50% more power than AMD's 8 core
>costs $500 more on top of it
>their 4 core can't handle gaming over 768p, and even then needs delidding, sanding and a custom water loop


How can we help our Intelbros, my fellow AMD friends?

Sure showed me
>2 FPS in one game
Was the extra 200 dollars worth it?

Also a lot faster outside of gaming :)
I didn't buy an 1800X yet, too expensive, but an overclocked 1700 is a real killer.

>Does worse in Adobe and gaming
>Desperately look for the two games out of fifty that perform better
How do you find the time to sift through all the benchmarks that don't show your side?

>their 4 core can't handle gaming over 768p, and even then needs delidding, sanding and a custom water loop
what

>does worse in 4k
>literally denying reality
>he still wants 4 presshot cores that run slower, hotter and can't run a VM to save their life

This is platinum gold.

>fx8350
>nearly same results
It's like the GPU is the bottleneck
Really makes you think

I forgot that playing at a resolution for human beings is bad, we should drop it down to 720p because that's what Intelfags can afford.

You literally can't make this shit up

>CPU matters less for 4k
>b-b-b-but muh 768p look at 768p those are important looook!

you should pair rypoo with dual rx480
ayymd told me it's better than gtx1080

>runs worse at 4k in ONE game
>blanket statements everything
ok bud, you do that

It isn't, that's why I'm getting a 1700 and a 1080ti for 4k gaming :) AMD's current GPUs are uncompetitive but thank fuck their CPUs aren't :P

The kike damage control is real.

Their entire argument is literally

>don't test at 4k!
>768p is indication of REAL PERFORMANCE 4k is FAKE
>buy kabylake for 4k netflix!

>768p
What the fuck?

Intelfags are pretty proud their chips game better at 1366x768 and 800x600, sometimes they include 1080p too.

But are very troubled by 4k, I hear 4k was a codename for some kind of disaster in Israel.
It's still under investigation.

Serious question: how the hell does the Zen chip do worse at lower resolutions yet somehow better at higher ones? Isn't the CPU pretty much only responsible for geometry and physics these days?

Also who else waiting for 6/4 cores? Poor fags unite.

Dunno, there's probably extra load put on its memory controller and caches at that resolution. Zen does indeed have a lot more throughput, as the 6900k and the higher core count chips display consistently higher results in both cases, so it's no doubt being put to use in some way.

The total differences at 4k aren't that big though, but still, a far cry from the gaps at 1080p.

Because at higher res the GPU is the limiting factor. When a more powerful GPU comes around, the gaps you see at 1080p now will show up at 4k too.
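Rough sketch of the mechanism if it helps (every number below is made up, purely illustrative): the frame rate you actually see is capped by whichever of the CPU or GPU is slower, so a low GPU cap hides the CPU gap and a faster GPU exposes it again.

# toy model in Python: delivered fps = the slower of the CPU-side cap and the GPU-side cap
# all numbers are invented for illustration, not real benchmark results
def delivered_fps(cpu_cap, gpu_cap):
    return min(cpu_cap, gpu_cap)

cpu_a, cpu_b = 170, 140      # two hypothetical CPUs and their frame caps in fps
gpu_4k_now = 60              # hypothetical GPU limit at 4k today
gpu_4k_future = 200          # hypothetical future GPU limit at 4k

print(delivered_fps(cpu_a, gpu_4k_now), delivered_fps(cpu_b, gpu_4k_now))        # 60 60 -> both chips look identical
print(delivered_fps(cpu_a, gpu_4k_future), delivered_fps(cpu_b, gpu_4k_future))  # 170 140 -> the 1080p gap reappears at 4k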

Did you make a more powerful GPU than a 1080Ti yet? I can't wait to see it.

All I'm hearing here is

>j-j-just wait for new $1000 GPUs to c-come out and buy them

Why the fuck is the 7700k doing so badly compared to all those other chips there?

It's been a fact since GPUs were created. If you want to be a fanboy and ignore simple facts and history, that's your choice.

It's generally a weaker chip.
Now before some faggot jumps on this post it's the fucking truth, just as a 12 core xeon is more powerful than the 6900 or ryzen, it's a simple case of throughput, aka total chip bandwidth.
Which might not necessarily mean better game performance.
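Rough arithmetic on that point (core counts and per-core numbers below are placeholders, not real specs): total throughput scales with core count, but a mostly serial game loop only sees per-core speed.

# toy comparison: aggregate throughput vs. what a mostly-serial game thread sees
# the figures are placeholders picked for illustration
cores_a, per_core_a = 12, 2.5   # hypothetical wide chip: many cores, lower per-core speed
cores_b, per_core_b = 4, 4.5    # hypothetical narrow chip: few cores, higher per-core speed

print(cores_a * per_core_a, cores_b * per_core_b)  # 30.0 18.0 -> the wide chip wins on total throughput
print(per_core_a, per_core_b)                      # 2.5 4.5   -> the game's main thread prefers the narrow chip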

>Which might not necessarily mean better game performance.

Unfortunately normies can't get that figured out. I had a coworker who was thinking about getting a Quadro card for PC gaming.

How's that 7700K buyer's remorse settling in?

KEK

>4K where i5s perform on par with 8c/16t CPUs and GPU is clearly the only factor that matters.

Really makes me think.

Let's see how that 1800x of yours does in 1080p, or let's say 4K at 144 fps, 3 years down the line.

kek.

>Pootel can't into 4k
>resolutionlets

>Lower resolutions are more taxing on the CPU
No they are absolutely not.

Are you trolling or are you serious? Do I even need to explain this?

It's not the CPU's job to be massively multithreaded, it's the GPU's: AMD kids are illiterate in technology.
They pretend a CPU is a GPU.

>d-dont listen t-to the f-f-fake benchmarks anonfriends
>only look at Intel approved benchmarks

>Lower resolutions are more taxing on the CPU

You're retarded, as are "reviewers" who test games at 720p/low to show CPU performance. Higher resolutions and settings result in more draw calls to the GPU, which in turn places more load on the CPU. The best way to test games is on the highest settings and resolution you can get WITHOUT running into a GPU bottleneck. That actually takes time and effort to find the sweet spot for each game though, so it's absolutely no surprise that these "reviewers" just go with the easy option and create a completely unrealistic scenario.
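For anyone who wants to know what hunting for that sweet spot would even look like, here's a rough sketch (the fps figures and the 5% tolerance are made up): step through the resolutions and keep the last one where the frame rate hasn't started dropping, i.e. the CPU is still the limit.

# sketch: pick the highest resolution that still looks CPU-bound
# the measured fps values are invented sample data and the tolerance is arbitrary
measured = {            # resolution -> average fps with the CPU under test
    "720p": 142,
    "1080p": 141,
    "1440p": 138,
    "4k": 74,           # big drop -> the GPU has become the limit
}

def cpu_bound_sweet_spot(fps_by_res, tolerance=0.05):
    resolutions = list(fps_by_res)
    baseline = fps_by_res[resolutions[0]]     # assume the lowest resolution is CPU-bound
    best = resolutions[0]
    for res in resolutions[1:]:
        if fps_by_res[res] >= baseline * (1 - tolerance):
            best = res                        # still within ~5% of the CPU-bound rate
        else:
            break                             # frame rate fell off -> GPU bottleneck reached
    return best

print(cpu_bound_sweet_spot(measured))         # "1440p" with these made-up numbers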

>3 years down the line.

Lmao
At that time I'll either have the Intel 17700k or the 4800X

>Higher resolutions and settings result in more draw calls to the GPU, which in turn places more load on the CPU
This is completely wrong. Higher resolutions max out the GPU which flattens the range of potential framerate, and since all of the test rigs use the same GPU, the result is that all of their CPUs appear to have very similar performance.
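To put numbers on that flattening (the frame caps below are invented): push a spread of CPU caps through the same GPU cap per resolution and watch the spread collapse.

# toy illustration of the flattening: several CPU frame caps, one shared GPU cap per resolution
cpu_caps = {"chip_a": 175, "chip_b": 160, "chip_c": 150, "chip_d": 120}  # invented fps caps
gpu_caps = {"720p": 300, "4k": 65}                                       # invented GPU limits

for res, gpu_cap in gpu_caps.items():
    results = {name: min(cap, gpu_cap) for name, cap in cpu_caps.items()}
    print(res, results)
# 720p keeps the full spread between chips; at 4k every chip reads the same 65 fps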

it does well in 1440p and 4k
only in inferior resolutions does it lose to the 7700k
and that somehow makes it a bad CPU all of a sudden

>AMD game
fyi it's an nvidia game

>WITHOUT running into a GPU bottleneck

God you're dumb. Go look up what a draw call actually is, retard.

>max out the GPU which flattens the range of potential framerate
do you know how the CPU and GPU talk to each other, right? through a magic bus on the PCIe route with a cool little driver?
when the GPU is maxed out it talks more to the CPU, making the CPU do more work, what does that tell us?

>WITHOUT running into a GPU bottleneck
that's a nice no true scotsman you've got there, glad your point is now completely irrelevant since in almost every real-world case this will indeed cause a GPU bottleneck and thus result in exactly what I stated in my post

>hurr ur dum if u disagree with me
abandoning thread, fucking children lol

Fuck man, I must be high because that shouldn't have made me laugh as hard as it did.

>Titan X

Based nVidia drivers

That is really impressive considering the two chips sandwiching that Ryzen are $1000+.

Why don't tech journalists just benchmark all common resolutions so readers can see what's best for their current or future use case?

I don't want to shit on your parade, but
>Loses performance in dx12
>Maxwell compared to similar time frame AMD cards
I mean yeah, this game seems to make better use of AMD cards, but still.

you don't say, tech journalism is even worse than videogame "journalism" in that regard

That's not fair according to Intel.

Since 768p is a REAL CPU test and not something where you actually work your CPU to the bone like POVray.
4k isn't important either, because the CPU barely makes a difference there but Intel still loses, so that's a no go

You're abandoning the thread because you got blown the fuck out. Pic related is a properly-conducted CPU benchmark, with Watch Dogs 2 running on Very High settings at 1080p.

>HURR DURR GPU MAXED FRAMES ALL SAME!!!

Wow, guess not, huh, because the 5960X blows the fuck out of a higher-clocked Skylake chip.