Intelfags on suicide watch

>muh IPC
Enjoy your stuttering mess

if I had a nickel for every thread on Sup Forums where the picture is some graph with bars and the rest of the thread is corporate shilling on all sides
I'd be rich

all I see is Intel winning lmao @ OP

>I do not understand 0.1% lows

>m-muh fps in a pre-beta game
lmao amdfags holding on tight

I don't understand 0.1% lows in the context of these graphs. Quick rundown please.

youtube.com/watch?v=uXepIWi4SgM
youtube.com/watch?v=l6WgbfG_z-k
youtube.com/watch?v=t2ECRoGdOlE
tl;dr Intel's stutters are more noticeable than the 171->154 average fps difference

>Inlet
Muh bigger bars

um shouldn't games be using either dx12 or vulkan by now?

R5 1600 overclocked on a B350 board is really the most fucking amazing CPU since Sandy Bridge, well done AMD.

Let's hope the 2600 is even better.

STUTTERS DON'T MATTER

>1080p
>1080ti

Get real you fucking retards, other than a few idiots, who's gonna actually run that overpowered GPU on a $150 monitor?!

Holy shit, is it that hard to actually use midrange GPUs for midrange builds? 1600 + 1060/580 for 1080p, for example

or overclocked 1080ti + Threadripper/Skylake-X/8700k/1800X for 1440p

Not this unbalanced shit, I wanna see the frames I WILL GET, NOT WHAT I WILL FUCKING GET IF I HAVE BRAIN DAMAGE

>CPU test
Or would you rather it be done at 720p lowest settings?

No user, to truly put CPUs to the test you must use a 1080ti at 240p for maximum fps. So not even this benchmark is fully accurate

Hell even 240p might bottleneck, we must go lower.

You know, there are actually applications that test the CPU properly (single/multi thread, mixed, individual fp/int, cache and latencies), not a purposely made synthetic situation like low res on an enthusiast GPU.

Answer me, what kind of information does one get from these low resolution benchmarks? Nothing besides how the CPU performs in that situation, a situation almost nobody finds themselves in.

Ergo the benchmarks are fucking useless since they don't provide useful information to anyone.
Also, if you respond saying there are a lot of people using a 1080ti on a 1080p monitor, don't expect a response, in fact get a tripcode so I can filter you.

It only matters if I watch an empty sky and walls for max smooth fps

by that logic the best setup would be the /dev/null equivalent GPU that doesn't actually try to process anything, just accepts all API calls and provides the necessary feedback so as to not make the CPU wait before doing something else. You do, however, need to use the high quality settings to properly model all the work the CPU would be doing in these scenarios.

Jokes about Intel aside, seriously the game looks like shit

where is this from

b-but muh bottleneck
we must benchmark gayms at 160x120 to remove bottleneck as much as possible

You said it

nice cherrypicked benchmark

consolebabbies btfo

Stuttering actually improves the gaming experience by reminding you that the game world isn't actually real, so you don't get too involved and end up shooting up a school.

Thank you, Intel.

Maybe people still on 1080p are also on 120Hz/144Hz. Maintaining stable 120fps takes sacrifices to detail even with the highest end cards.

A few shitty ports aside, an overclocked 1080ti can push out 144FPS even at 1440p, let alone 1080p.

The 1080ti is in fact very close to 60FPS@4k, which is 4x the resolution of 1080p.

>stable
Is the keyword.

That's a 1080.

And what do you mean stable, you want 144FPS on 0.1% lows? Don't be autistic.

People are retarded, these chucklefucks will stay on 1080p as long as higher refresh rate monitors keep popping up, yesterday they wanted 144@1080p, now they want 240@1080p, it'll just keep going up.
Of course let's just ignore that the game looks like a pixelated mess, muh 300000 frames per second on an engine that's limited to 200

Higher resolution would actually favour Ryzen more

>g-sync

Resolution scaling >100% works pretty well as a stopgap too.

I don't care, the problem is the imbalance; there's very little difference between CPUs at that resolution anyway. Current testing methodologies give no useful information for anyone who actually plays games, that's the problem.

Ummm no sweatie, DX11 is the industry standard because of muh legacy support, W7 marketshare prevents dx12 adoption, and big studios are unwilling to cut into Microsoft's money and have everyone relearn the API for honestly minuscule gains in day to day usage.

You're either blind or retarded because there's a massive difference between CPUs at that resolution, especially if you want a smooth gameplay experience.

Virtual resolution is useful in some games, but it has been a headache for me in more.
Can't really beat native.

It will tell you how many frames a CPU can manage in a game the reader might be interested in and help him with the choice of his hardware. A 1060 can do 144hz with reduced settings just fine in most games, but you don't usually get those benchmarks. If he buys a CPU because some user said an fx8300 is just fine for his R9 290 because they were both released around the same time and were both flagship parts, and shows him 60fps in Borderlands 2, he could feel swindled.

Vulkan
that said, shit like this needs time to implement, and it's a Japanese studio; the fact it's even on dx11 from one of them is a blessing in and of itself.

How many frames a CPU can manage with a 1080ti; if someone's not buying a 1080ti, these scores are useless.

to be fair, who the fuck wants to buy anything over a 1080p monitor?

fuck having to deal with shitty upscaling, making everything a blurry mess because absolutely nothing is made for 1440p

>being this old
Check your eyes and get good.

Pretty much any game engine made in the last 5 years has up to 4k support

Why would I need 120+ fps in fucking Final Fantasy?

enjoy your 24 fps, which is just what the eye can see, I guess.

Doesn't change that movies will look like massive ass as well, because of having to upscale shit.

Also, fuck those 27" 1440p IPS monitors, god damn IPS glow.

>24 fps
Try not buying poorfag GPUs.

no gpu can play 1440p maxed out with 165fps constantly.

If it can't do 144hz gaming with a 1080ti at lowest settings, it means it can't do 144hz no matter what card you pair it with. A 1060 can do 150+ stable at 1080p native on medium settings in a lot of games, if paired with a 5GHz i7.

>lowest setting
Benchmarks usually run everything maxed out

>native on medium settings
Literally console tier

>maxed out
Have you ever played videogames on a PC before you were offered this job here at Sup Forums?

>intel has some issues on XV
>shills immediately divert the thread into GPU shitposting with framerates and muh resolutions
Really activates those Thought Processing Units.

>what is crysis

techpowerup is bretty gud

>ran by a pajeet

Lmao

>tfw you make a nickel for each word you write in your shill posts

Game is finished and will be out next month.

>24 fps
It's what I used to play with back in the GeForce 2 / TNT2 / Voodoo days. Even up until Doom 3/Far Cry, 30-40fps on ultra seemed to be the norm, until that faggot unoptimized benchmark software Crysis. Once SLI/CFX matured, reviewers started pushing the 60fps meme and microstutter.

Honestly, all these benchfaggots should be using box and whisker plots. They give a much more accurate representation of the performance than a single average bar.
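Since the complaint here is averages hiding stutter, here's a minimal sketch of the box-and-whisker idea in Python with matplotlib, using made-up frametime data. The numbers and the "CPU A"/"CPU B" labels are invented for illustration, not taken from the benchmark in the OP.

```python
# Sketch: box-and-whisker comparison of per-frame FPS for two hypothetical CPUs.
# "CPU A" has a higher average but rare 30 ms hitches; "CPU B" is slower but steadier.
import random
import matplotlib.pyplot as plt

random.seed(0)
cpu_a = [random.gauss(6.0, 0.6) + (30.0 if random.random() < 0.002 else 0.0)
         for _ in range(10_000)]              # synthetic frametimes in ms
cpu_b = [random.gauss(6.6, 0.4) for _ in range(10_000)]

def to_fps(times_ms):
    return [1000.0 / t for t in times_ms]     # ms per frame -> instantaneous FPS

plt.boxplot([to_fps(cpu_a), to_fps(cpu_b)],
            labels=["CPU A", "CPU B"],
            whis=(0.1, 99.9),                 # whiskers at the 0.1th/99.9th percentiles
            showfliers=True)                  # hitches show up as outlier dots
plt.ylabel("FPS (per frame)")
plt.title("Averages hide the stutter; the whiskers and outliers don't")
plt.savefig("fps_boxplot.png")
```

In a plot like this the two medians sit close together, while CPU A's outliers dip far below, which is the same story the 0.1% lows tell.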

Synthetic CPU tests are fine and we use them all the time. However, game-specific tests show the real performance. You can't just ignore the game test and base your theory on synthetics to judge the performance in a game that has a game-specific benchmark. That's just retarded.

Any gaymen benchmark above 320x240 lowest settings = bottleneck so it's invalid

>game test
The game tests in question are synthetics too, since they test an unrealistic combination of resolution, GPU and CPU

g-sync makes it so it's no big deal if the framerate dips below your target, and you can just turn down the settings a bit; max settings in modern games are usually overkill

Significant differentiation at 1080p ultra settings matters for a lot of people and is valuable information. The difference at 240p matters for no one.

Can't buy a GPU in this lifetime so it doesn't matter.

elaborate.

i'd whisker her box

Intel is FINISHED & BANKRUPT!

NOOOOOOOOO

you are clueless, the 1080 ti has a problem keeping 60 fps at 1080p in some games when maxed out. GR Wildlands, AC Origins, GTA 5, just some examples.

I don't understand these kinds of unironic faggots. You niggers can read a chart, yes? Every single Ryzen part in there can go above 120fps, I bet most of you don't have 120Hz monitors let alone 240Hz monitors.

Good thing I don't play newer games. Buying a brand new overkill card for a few unoptimized trash games does not seem smart to me at all.

>ryzen 5 struggling against a fucking i3
MY SIDES

They'd be lucky to make more than 3 cents/word.

>3centavo
*fixed

>8700k has lower 1%/0.1% lows than a fucking 1400
MY SIDES
shitposting aside, why does Ryzen always get drastically better frametimes even with fewer cores/threads and lower clockspeeds/IPC?

More L2 cache per core and fairly fatter cores, I would think. AMD splitting int, float, and load/store into their own dedicated units, and setting up int and FP with their own dedicated schedulers instead of running it all off of one unified scheduler like Intel does, has to help too. It's also probably why AMD's hyperthreading scales better than Intel's does.

Int/float context switching historically hits AMD with almost no penalty.
Intel seems to suffer when you mix workloads.

Looking at block diagrams of AMD's previous and current architectures, I'd be willing to put money on it being because AMD has separate execution units for their Int and Float hardware.

For example, pic related is Phenom II

Construction Cores and Ryzen

And for comparison, Skylake. Notice the execution units and how both floating point and integer pipes share ports, with everything crammed onto the first 3 ports.

average frame rate is all that matters.

1% and 0.1% lows indicate stutter. Basically, the higher your 1% and 0.1% lows (i.e. the closer they are to your average), the less stutter you'll experience while gaming, which makes the overall experience more immersive.
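For the anon asking for a rundown earlier, here's a minimal Python sketch of how these numbers are typically derived from a frametime capture. The log format (one frametime in milliseconds per line) and the exact definition (average FPS over the slowest 1% / 0.1% of frames; some reviewers use the 99th / 99.9th percentile frametime instead) are assumptions, not any specific reviewer's method.

```python
# Sketch: compute average FPS and 1% / 0.1% lows from a frametime log.
# Assumes one frametime in milliseconds per line (OCAT/FRAPS-style capture).
import sys

def percentile_low(frametimes_ms, fraction):
    """Average FPS over the slowest `fraction` of frames (0.01 -> 1% low)."""
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * fraction))        # how many frames count as the "low"
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms                        # ms per frame -> FPS

if __name__ == "__main__":
    frametimes = [float(line) for line in open(sys.argv[1]) if line.strip()]
    print(f"average FPS: {1000.0 / (sum(frametimes) / len(frametimes)):.1f}")
    print(f"1%   low   : {percentile_low(frametimes, 0.01):.1f}")
    print(f"0.1% low   : {percentile_low(frametimes, 0.001):.1f}")
```

The point of the metric: a single 30 ms hitch barely moves the average over a 10,000-frame run, but it drags the 0.1% low way down, which is exactly the stutter you feel.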

I have a titan xp on a 1080p 60hz monitor

>these chucklefucks will stay on 1080p as long as higher refresh rate monitors pop up
Depends on what they're playing. 1080p with extremely high refresh rates is best for first person shooters and fighting games where speed matters (some even opt for CRT since the screen latency is an order of magnitude less than LCD). If you're playing shit like World of Warcraft or an RTS, a 4K monitor makes more sense.

>g-sync
Enjoy your slower response time.

Nope
youtube.com/watch?v=mVNRNOcLUuA

Whoops, this one
youtube.com/watch?v=F8bFWk61KWA

How do we solve the Sup Forums problem?

Makes sense, Intel is bad on the context switch due to the TLB flush, and now it's even worse due to the MELTDOWN mitigations.