When the fuck are there going to be CPUs that are actually better for gaming...

When the fuck are there going to be CPUs that are actually better for gaming? It's horseshit that a $1500 CPU is no better than a $300 i5.

When game developers learn to make decent games. It's really no wonder not a single modern game is properly optimised to take advantage of threading when almost all of them are made by 'developers' with shitty joke degrees in vg development.

That's not even the problem. There are a million old and shittily optimised games out there that you can't run well no matter what hardware you have. Clock for clock, gaming performance is near identical across all Intel processors since Sandy Bridge, and that shit came out in 2011.

It basically means if you were CPU bottlenecked in a game in 2011, you probably are now too.

video games are for children so no one cares

I really don't understand why Sup Forums gets so worked up over Intel vs AMD shitposting competitions when they have no real use for their fucking products anyway.

What the hell are you all doing with your processors? You don't need a 5960X for medium shitposting loads, an i5 would work fine.

When they manage to lower core to core latency.
The 7700k is King at gaming because it has the highest clocks and LOWEST core to core latency

retard alert

...

Provide information to support your baseless claim.

>It's another incorrectly benched CPU thread where AMD shills pretend to be oblivious to hide their CPUs' shitty single threaded performance.

Rerun those benchmarks in 4K three years from now with whatever modern 1080 Ti we'll have then and see those Ryzen CPUs suddenly being 30 frames behind in 4K.

Literally no one has even shilled AMD once in here, what the fuck are you talking about?

It is a bit funny to hear those "100fps is fine" arguments.
In a few years it will be a larger bottleneck.
However, the raw power it offers is undeniable. I'll be picking up a 1700 for video rendering and keeping a delidded 7700k for Muh gamin.

>Literally no one has even shilled AMD once in here, what the fuck are you talking about?

Have you been living under a rock?

But user-chan, the more multi-threaded games get, the more power the rise of the Zen gets compared to the maxxxed out 7700k

So in a few years it'll fare better

nace baito tho

They don't even need graphics cards either. These stupid niggers can go back.

>for gaming

Nice cherrypicking of a GPU-limited game, retard.

Not him but a few issues with that.
1. Like 2 games actually effectively use 4+ cores.
2. We've been hearing this "multi core" shit for like 6 years.

Just like Bulldozer got worse over the years, right?
Oh wait, it actually got better compared to Sandy Bridge :^)

>1. Like 2 games actually effectively use 4+ cores.

Utter horse shit. Every "AAA" release from the past few years uses at least four threads. Many won't even launch on CPUs with less. You're living in a fucking fantasy world where reality bends to your assumptions, but the real world has moved on. Late 2014 was when games first started appearing that wouldn't even launch on 2c2t chips; it's memorable due to how badly G3258 owners got cucked. In 2017 there are fucking tons of games that run better on an i7 than on an i5, thanks to the extra threads. Even a 2600K generally provides a smoother gaming experience than a 7600K.

it's actually better though; avg fps is a false metric for CPU gaming performance
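
To make that concrete, here's a minimal sketch (C++; the frame times and function names are hypothetical, just for illustration) of why average fps hides stutter while a "1% low" figure catches it:

#include <algorithm>
#include <numeric>
#include <vector>

// Average fps over a run: total frames divided by total time.
double avg_fps(const std::vector<double>& frame_ms) {
    double total = std::accumulate(frame_ms.begin(), frame_ms.end(), 0.0);
    return 1000.0 * frame_ms.size() / total;
}

// One common "1% low" approximation: the 99th-percentile frame time,
// converted to fps. A handful of long frames tank this number while
// barely moving the average.
double one_percent_low_fps(std::vector<double> frame_ms) {
    std::sort(frame_ms.begin(), frame_ms.end());
    return 1000.0 / frame_ms[frame_ms.size() * 99 / 100];
}

Feed it 99 frames at 16.7 ms plus one 100 ms hitch: the average is still ~57 fps, but the 1% low is 10 fps, and that 10 fps is the stutter you actually feel.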

quality of life question more than raw performance

yeah but intel lolol is doing it too now, so it will happen this time

because intel is doing it, game devs are gonna follow

Until recently there was no reason for game devs to use more than 4 cores. Nobody used 6+ cores because they were basically unaffordable.
Now with Ryzen and Intel bringing in 6+ cores at more affordable prices, there is an actual incentive to optimize for more than 4 cores.

>Like 2 games actually effectively use 4+ cores.
like 90% of games starting from 2013, and about 60% of games from 2008

There are literally almost no games using more than 4 cores. Period. It's the same reason why a meme CPU like the 7700k is still better for vidya than $1000 Intel CPUs.

are you mentally fucktarded?

Intel is releasing high core count shit now too, which means game devs WILL make it multi threaded

b t f o
t t
f f
o o

>Using meme typing
Ask me how I know you're a Sup Forums alien.

>mad because he got btfo

maximum damage control

gta4-5
civ4-5-6
warhammer
MD
rotr
witcher
numerous car games i don't care about
MEA
dragon age origins, not that it needs it but it does
DAI
Battlefield 4-1
overwatch of all things
Alien Isolation
prey(duh)
dishonored 2
Cities: Skylines

off the top of my head, I can find more if I bother

oh right, ALL of Ubisoft's games, yes all of them.

So is the i5 or i7 going to be 6 cores?

We calculate π digits with y-cruncher

Literally 80% of these are wrong. Placebo effect with 1% difference even if you remove the GPU bottleneck.

The 7700k is just the highest clocked one, latencies between cores are a meme

Get out of here pedo

Literally all of those are wrong.

Go look up benchmarks in those games of i5/i7 vs X99 CPUs, there's no fucking difference.

You realize high-end CPUs are aimed at video editing and servers?

Why can't they fucking aim them at people who play games?

People are more than happy to dump $1000 into a 1080Ti, why not a CPU? All it needs is stronger cores.

This.

The CPUs ARE faster, it's just that these lazy-ass devs never try to use their full potential because of money and time.

The 7700k is still faster in 90% of these "multi threaded" games compared to 6- and 8-core chips.
Rendering your argument wrong.

>latencies between cores are a meme
Explain why lowering core to core latency in ryzen makes a huge difference then?

Because we are at the end of 'stronger cores' you fucking retard.

Multithreading is the only way to go forward.

Fuck you faggot, I want stronger cores.

Get Intel on the phone.

>a huge difference
Show me please.

feels like a samefag, but whatever

they have at least 20-30% more in min. frames
all that matters.

>they have at least 20-30% more in min. frames
>all that matters.
That's a HUGE difference

Also, moot confirmed summerfags were a myth.

>they have at least 20-30% more in min. frames
Stopped reading.

This entire thread is based on a misunderstanding. CPUs don't seem to make a huge difference because games just don't require that many calculations on the CPU compared to the GPU.

If you want the CPU to matter more, don't look for more powerful processors, look for more demanding games. Message game devs, not proc manufacturers.

CPUs are already too powerful for most games; the GPU is the bottleneck in every current high-end system. Get two 1080 Tis in SLI and see your 'better' frames.
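
Rough numbers for that claim (all figures hypothetical): a frame is roughly as slow as its slowest stage, so

frame budget at 144 Hz: 1000 / 144 ≈ 6.9 ms
say CPU work = 3 ms/frame, GPU work = 9 ms/frame
fps ≈ 1000 / max(3, 9) ≈ 111 -> GPU-bound
CPU twice as fast (1.5 ms): fps ≈ 1000 / max(1.5, 9) ≈ 111, nothing changes
GPU twice as fast (4.5 ms): fps ≈ 1000 / max(3, 4.5) ≈ 222

Until the GPU time drops under the CPU time, a faster CPU buys you exactly nothing.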

>Has a 60Hz monitor, games at 200fps and talks about how 'smooth' it looks.

...

That's clearly a cherry-picked example, although that is a specific instance where the CPU does matter, I guess. Do you know what GPU is being used here?

I know this is b8 but no monitor can display that anyway

>testing CPUs in a GPU limited scenario
Where does this shit come from so I can block the website in my hosts file?

It's Linus Tech Tips m8

>When the fuck are there going to be CPUs that are better for gaming

This happened over a decade ago. It's called a GPU, and it generally isn't limited by the CPU.

Are you fucking retarded, OP? Or just stupid? Or both?

Aren't they supposed to be somewhat serious?

How is it b8? CSGO on that 8350 would be literally unplayable for any serious player. You've got to remember those are averages; more than likely the 8350 would be dropping down into the low 100s, and Ryzen may even drop close to 200 too. In CSGO you need a stable 200FPS, minimum.

>Are you fucking retarded, OP? Or just stupid? Or both?

Interesting question.

>replace GPU with one 2x as powerful
>framerates are still garbage
>check CPU usage
>100% on one core

Who's the retard now?

Still you, for bringing up an issue based on poor programming for multicore/multithreaded CPUs and not an issue with the design of CPUs.

>Game engine that came out in 2005

So what, the API is shit, how is it related to CPUs? Shouldn't they bench a more recent game that takes advantage of CPUs?
What's the point of this retarded benchmark, linus?

The difference between the 7700k and 1800x, dumbass; no monitor can display more than 240 Hz

So you're saying I have no reason to be asshurt that someone can drop two grand on SLI 1080 Tis and still get drops to 30FPS because Intel has been too lazy to make better CPUs in 7 years?

It's not about displaying it, it's about maintaining as high a minimum as possible. Even on a 100Hz panel you can still feel the difference between 150FPS and 300FPS, and the Source engine is fucked up and will mess up your mouse sensitivity as the framerate changes, so it's important that FPS stays as consistent as possible.

Fucking retards.

So faster cores wouldn't solve the problem? Why don't you kill yourself?

You should be butthurt that devs don't take advantage of CPU processing power. Multicore/multithreaded CPUs are the norm now and APIs have advanced since 2011.

Tell Tomb Raider devs to patch their game so it takes advantage of your CPU.

...

If you have more than one core and the game isn't utilizing them, blame the devs of the game. Dumbfag.

Yeah but.

>go to BIOS
>underclock CPU
>framerate goes down
>vice versa

Saying 'blame the devs' doesn't help anything, the only solution is better hardware.

Haha, normally the dumbfags are all about 'team red is better, no team blue is better' but you are on a whole other level of dumbfaggery.

Imagine you have 6 ovens and need to bake 6 cakes on a deadline. Do you act like a retard and try to get everything done on a single oven, or use all 6 you already have?

Irony partially intended.

>30 frames behind
They're not even 30 frames behind now at 1080p 144hz, retarded frogposter.

Not all computations can be performed in parallel, it's not fair to put all the blame on developers.
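
There's a standard formula for this (Amdahl's law). With a fraction p of a frame's work parallelizable across n cores (the p = 0.6 below is an illustrative assumption, not a measurement):

speedup(n) = 1 / ((1 - p) + p / n)

p = 0.6, n = 8: 1 / (0.4 + 0.6/8) ≈ 2.1x
p = 0.6, n -> infinity: 1 / 0.4 = 2.5x max

The serial 40% caps you forever, no matter how many cores you throw at it.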

But if I'm the worker, and my faggot boss is telling me I can only use two ovens, I have no choice but to use two ovens. Thus, getting two bigger ovens would be better than throwing more useless ovens into the equation that I can't use.

Not only that, but retard frogposter assumes newer games won't take advantage of the extra power of new GPUs, which is fucking stupid.

Couldn't agree more. As an electrical engineer, it's actually quite sad to see newer, more powerful hardware always get paired with shittier software, most likely written by idiots who never did computer science. The fact that any idiot can pick up coding and learn it at home as long as he has a computer is exactly what made it so shitty.

And yes, the degree is important. It's about the mentality, the mathematical maturity, the way you solve your problems. Any monkey can code, it isn't something hard. But actually knowing what you're doing and making sure you're extracting the most performance isn't so trivial.

>Ovens
I like where this analogy is going.

>It's really no wonder not a single modern game is properly optimised to take advantage of threading when almost all of them are made by 'developers' with shitty joke degrees in vg development.
Pretty much.

>I'll be picking up a 1700 for video rendering and keeping a delidded 7700k for Muh gamin.
This is what most people do.

JUST

WAIT

Wintel forever gayyyymd poorfags btfo

Pajeet CPU is for pajeet

Every core is its own CPU; any load on a single core can be shared with multiple cores as long as the software is programmed to take advantage of it. There are no computations that can't be shared across multiple cores/threads. Optimizing for it is difficult, but considering that we now have 8-core consumer-grade processors, not doing it is unacceptable.
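
For the easy case that post describes (lots of independent per-object work each frame), here's a minimal C++ sketch of fanning a loop out over threads; update_entity is a made-up stand-in for whatever a game does per object:

#include <cstddef>
#include <thread>
#include <vector>

// Hypothetical per-object work; the point is each call is independent.
void update_entity(std::vector<float>& pos, std::size_t i) {
    pos[i] += 0.016f;
}

void update_all(std::vector<float>& pos, unsigned n_threads) {
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < n_threads; ++t)
        pool.emplace_back([&pos, t, n_threads] {
            // Each thread takes a strided slice of the entities,
            // so no two threads ever touch the same element.
            for (std::size_t i = t; i < pos.size(); i += n_threads)
                update_entity(pos, i);
        });
    for (auto& th : pool) th.join();
}

This only works because the iterations don't depend on each other, which is exactly the objection raised further down the thread.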

Big game development studios should hire more devs specialized in multicore optimization. Instead they choose to cherry-pick multicultural developers to fill some sort of 'racial rainbow quota', and the best contributions those racefags have are "I think the game should have a black female protagonist, the story and gameplay are irrelevant"

CPUs aren't faster
They have more cores
Video games are not very compatible with parallel processing

you have no idea what you're talking about

>Many won't even launch on CPUs with less
Literally Far Cry 4 and Primal, and there's been a special .dll to bypass that for a long, long time.

>Why can't HEDT CPUs aim at gaming?
Because video games and multithreading past 4 cores don't play well together 90% of the time, and that only counts for those games that do have multithreading.

Intel's reached the pinnacle of single core performance so they're following AMD's trend; now we just need game devs to step their game up and make games with proper multithreading.

>There are no computations that can't be shared across multiple cores/threads
there are computations that become slower when shared across multiple cores, so that statement isn't really true
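
Right. A minimal example of the kind of loop that can't be split (the update rule is made up, but the shape is what matters):

#include <cmath>

// Each step needs the previous step's result, so a second core has
// nothing to do but wait. Handing iterations between threads would
// only add synchronization cost on top of the same serial work.
double simulate(double x, int steps) {
    for (int i = 0; i < steps; ++i)
        x = 0.5 * x + std::sin(x);  // step i+1 depends on step i
    return x;
}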

>CPU manufacturers are at fault for my shitty game not being able to properly use my CPU, even though they didn't work on the game at all
That was your statement. Read it, process it, and once you've understood/embraced your own retardation, remember to say goodbye to your loved ones before fucking killing yourself, thank you very much.

>tfw still have an i5 2500K and GTX670
:3c

>Intel's reached the pinnacle of single core performance

So you're saying 2500K is going to be fine in 2030?

That totally depends on Intel's next architecture; they were gonna launch it in 2021 or 2022 iirc.

Some 4X games have multi-threaded AI, so in a handful of cases you can see benefits today

>only one amd cpu makes it onto the list
like pottery

>Only one brain cell made it into this guy's head

Stop playing shit games that don't use your hardware properly.

I'm pretty sure the main problem is DirectX being a piece of shit; I swear every game with performance issues and inconsistent framerates runs on DX11.

More like splitting 1 big cake into 6 in order to fit them in the ovens

>Video games are not very compatible with parallel processing
And yet Sony and Microsoft went with octa-cores for their systems instead of highly clocked dual or quad cores

This is still relevant today. Many shit PC ports still use only 1 or 2 cores to their "fullest" potential and mostly ignore the rest, just like an engine from 2005

And yet you still see babbies whining that Vulkan/DX12 sucks (because it doesn't work as well with NVidya's lack of hardware scheduler)
Last time I remember this happening (Batman Arkham Knight) there were mass protests and refunds.

>And yet you still see babbies whining that Vulkan/DX12 sucks (because it doesn't work as well with NVidya's lack of hardware scheduler)
AMD doesn't see big improvements either. Look at Rise of the Tomb Raider. DX12 actually runs slightly better on NVIDIA if I recall correctly (but both run worse than DX11)

Vulkan is actually good, I have zero faith in DX12.

We don't need better CPUs for gaming.
We need better games for CPUs.