How much does CPU performance effect game performance these days?

I have a good ol' 2600K with a standard 4.5 Ghz overclock. I feel no need to upgrade. But if for instance I bought a GTX 1080 would I be looking at any bottlenecking in game performance? In this case I would be talking about 4K gaming generally.

It makes a big difference, and buying a 1080 with a 2600K would be retarded.

CPU is very important. But you might be better off upgrading your CPU first, depending on which GPU you are currently using.

Depends on the game, really; some are more CPU-intensive than others.
thebottlenecker.com/

That CPU is more than enough for every game. Get a new GPU, and in 2-3 years, when it starts showing its age, maybe start thinking about a new CPU.

PROCESSOR PERFORMANCE TO THE FOREFRONT

It depends on the game and what refresh rate you're targeting. High-refresh-rate monitors have become popular recently, necessitating high-end CPUs. If you're just playing at 60Hz, just about any semi-modern i7 or Ryzen CPU will suffice.

A lot. When I got into PC gaming I had someone build a budget rig for me. It was shit and couldn't run most things smoothly. After becoming a bit more knowledgeable about PCs I decided to save for an upgrade and bought a GTX 970. My PC only got slightly better and I still couldn't run things as smooth as I wanted. It's only after a year or so of research and money saving that I built my own rig with a new mobo and an i7. The CPU is what makes a massive difference in performance a lot of the time. I still use my 970 and can run most if not any game with high fps.

It depends on what games you're playing. Some shitty mobile port isn't going to need a strong CPU, but if you're emulating Wii U or playing a modern MMO then you need a decent CPU.

Are you me? Same processor and same overclock, but I'm still rocking a GTX 680. Can't upgrade because new GPUs are expensive as fuck and I'd have to get a new CPU and mobo to make it worth it.

Not so much, because of how bottlenecked the current consoles are. If the next generation has a big boost in CPU performance, then expect games to start taking advantage of it.

Always buy an i7. If you had done that when you bought that i5, you would still be good.

That is an i7.

A 2600K is fine for a 1080 at anything above 1080p.

Can someone explain to me what that is?

For 4K, just drop your PC in the trash; you are fine with 1080p.

>buying a 1080 for 1080 gaming

It's a CPU!

It's excellent for 144Hz 1080p.

A what?

>thinking resolution is the only factor to consider

A seepy you

You guys are absolutely - absolutely fucking retarded.

Your GPU is almost always the bottleneck in video games. Go have a look at some benchmarks: the only time your CPU is a bottleneck is when you are running at low resolutions and trying to get very high framerates.

So: if you are playing a game at 1080p or under, you have a high-refresh 144Hz monitor, and you are looking to get ~140 fps. That is a really niche case.

If you are playing games above 1080p, then the difference between any decent CPUs from the last 4 to 5 years is negligible.

You don't need a ridiculously fast CPU for games. 95% of the time you are better off spending the extra money on a better GPU.
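
To put rough numbers on it, here's a minimal sketch in Python (the frame times are made up for illustration, not real benchmarks): per-frame CPU cost is roughly resolution-independent, per-frame GPU cost scales with pixel count, and your framerate is set by whichever is slower.

# Toy bottleneck model: fps = 1000 / max(cpu_ms, gpu_ms)
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 7.0  # per-frame CPU cost, roughly the same at any resolution
gpu_ms = {"1080p": 5.0, "1440p": 9.0, "4K": 20.0}  # GPU cost grows with pixels

for res, g in gpu_ms.items():
    limiter = "CPU" if cpu_ms > g else "GPU"
    print(f"{res}: {fps(cpu_ms, g):.0f} fps, {limiter}-bound")

# 1080p: 143 fps, CPU-bound  -> the one case where the CPU matters
# 1440p: 111 fps, GPU-bound
# 4K:     50 fps, GPU-bound  -> the GPU is the wall, not the CPU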

>Go have a look at some benchmarks
But GPU benchmarks are always done with the most modern high-end (gaming) CPUs.

Heck, I have an i5-750 quad-core clocked to 3.6 GHz, and I can still run modern games on medium/high and get 100/50 fps. I have an RX 480 with 8 GB VRAM, which helps, I think.

I think your 2600K is good for at least 2 generations.

And the 2600K launched 7 years ago.

This user is a shill. Never buy Intel unless you like wasting money and Meltdown.

I think you're missing the point. You don't buy a GTX 1080 to play games on medium at 50 fps.

Where is the AMD CPU that can match an Intel for gaming?

They will often test newer CPUs against older ones in benchmarks. Just make sure you look at the resolution they are testing at; specifically look for 4K or 1440p benchmarks if that's what you are running.

>You don't buy a GTX 1080 to play games on medium at 50 fps.

Sure, but with a 2600K @ 4.5 GHz I refuse to believe the CPU bottlenecks the 1080 GPU to such a degree that it matters. I think with a 2600K and a 1080 you'd get a comfortable framerate at high settings in most games.

I'd suggest buying the 1080 now, then buying a new modern chipset/CPU/RAM in 2020 and using them with your 1080.

Yeah, enjoy that extra 2 fps when you pay twice as much for a processor, lol. Then Intel releases a new socket after a year so you can't just upgrade your processor in the future.

Looks like you were jewed hard.

>I refuse to believe the CPU bottlenecks the 1080 GPU to such a degree that it matters.
But why? You're talking about a GPU designed to run games at the limits of technology. That's what you're paying for. Why would you assume that a CPU that's 5 generations old would also be able to run games at the limits of technology?

Wires hooked up to every single pin on the CPU.

Those are all modern CPUs and that's only one game. Some games are more CPU heavy than others.

Does this remind anyone else of the pic of the exposed horse hoof?

That's a bad benchmark. You need to choose an open-world game or one of the latest CryEngine titles. CPU matters quite a bit in those games.

I'm glad I'm not the only one. Disturbing.

>Why would you assume
SO YOU BE SAYIN
if you read what I actually wrote

>I think with a 2600K and a 1080 you'd get a comfortable framerate at high settings in most games.

Sure, if you are going to run games at very high or ultra 4K in VR, it matters. However, why buy an older card that will die with the PC rather than a 1080, which can still be used in the future if OP decides to get a new mobo, CPU, etc. after a year or two?

Yeah, they are all modern CPUs?

The media tries to push the idea that Intel is way better for gaming.

>paying double the price for an extra 5 fps

>foal_hoof_removed.jpg

I used an i7-975 for years and thought it was great.
Totally wrong.

CPU is VERY fucking important.

>In this case I would be talking about 4K gaming generally.

At 4K, you're limited by the GPU at the moment. The GTX 1080 will always be the limiting factor, unless you decide to use a really old and crappy Phenom or Pentium, for example.

Doesn't look like a CPU, looks more like a northbridge/southbridge controller on a laptop.

CPU is always the most important for simulation-heavy games, i.e. flight simulators, warfare simulators (Steel Beasts, DCS, Sturmovik, Rise of Flight, etc.), or games that generate massive numbers of individual units on screen (the Total War series).

The ARMA/Operation Flashpoint series still performs like shit because the devs can't seem to optimize their games.

User, I had a 2500K till I recently changed it. Amazing CPU; it lasted me almost a decade.
But no: you can get a 30-40% increase in framerate by upgrading your CPU if you have a 1070 or better GPU.
Just because the GPU is the MAIN factor doesn't mean a CPU, especially a 7-year-old one, doesn't matter.

It barely matters at all; most games are not CPU-intensive, and most don't take advantage of lots of cores.
If you want 100+ fps, then you might need to consider a better CPU.
If you want to emulate, then consider a better CPU; emulation is pretty much all done on the CPU.
If you stream, then you might want to consider something a bit better with more cores.
A 2600K is an excellent processor, and that overclock is plenty high. Honestly, if that were what I had, I wouldn't feel the need to upgrade at all.
t. i7-920

It really only matters if you care about >60Hz monitors. A 2600K should reach 60 fps with a 1080 in most titles, no problem. If you're just gaming and want to upgrade for 60 fps, I'd go for an R5 1600 or i5 8400; if you want to go 120Hz or 144Hz, I'd go for an i7 8700K.

GPU doesn't let me emulate shit though. Fuck you people.

I can assure you that it can't get 60 fps at 4K like he wants in anything past 2014.

That's why I said a decent CPU from the last 4 to 5 years.

>4K
why
seriously, what the fuck is worth playing at a resolution that high
t. AMD APU

>R5 1600
I got this and it's fucking great.

Reminder to get fast RAM, as in 2866 MHz+ with CL14, because CL16 has a nontrivial impact on Ryzen performance and is also for fags.
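
If you're wondering why CL matters, the rough math (for DDR4, where the rated MT/s is double the actual clock): first-word latency in nanoseconds is 2000 * CL / data rate. A quick sketch in Python; the kits below are just example numbers, not recommendations:

# First-word latency for DDR4: latency_ns = 2000 * CL / data_rate_in_MT_s
def latency_ns(cl, mt_s):
    return 2000.0 * cl / mt_s

for cl, rate in [(14, 2866), (16, 2866), (16, 3200)]:
    print(f"DDR4-{rate} CL{cl}: {latency_ns(cl, rate):.2f} ns")

# DDR4-2866 CL14:  9.77 ns
# DDR4-2866 CL16: 11.17 ns
# DDR4-3200 CL16: 10.00 ns

And on Ryzen the memory clock also drives the Infinity Fabric between core complexes, which is why the frequency part matters beyond raw latency.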

>not futureproofing for the age of every game is heavily threaded
>not todayproofing for the age of most games are already heavily threaded

wtf is this picture? It looks absolutely grotesque.

My thought exactly.

The CPU doesn't matter so much when you go over 1080p.

I'd get a mid-range AMD CPU like a 1600; they're still brilliant at games but not ridiculously expensive, and you can upgrade easily, as the AM4 motherboard is going to be around for two more iterations of Ryzen, until 2020.

You like looking at pixels instead of the game?

What the fuck are you even talking about?
Games look more than fine at 1080p.
Consumerist faggots like you killed this hobby and PC building in general.

I just like being able to see what's in the distance.

W-why?

That doesn't even make any sense

Something like that is only done after the horse is dead

Gross. At least it isn't that beehive skin disease people get on their feet, with all the holes in it.

Also, what about 1440p 144Hz? I have an 8700K coffinmeme; I'm also running it on Windows 7.

Thanks, user. I was legit sad for a bit there.

So what the fuck is this picture? Can someone please tell me?

>why would you want to play at a higher resolution than 960x540?

>what the fuck is worth playing at a resolution that high
shooters, especially ones like PUBG

I see, I wonder for what purpose

Someone bent the pins on their CPU, so they just soldered wires to connect to the socket.
I agree. I just play at whatever the native resolution of my screen is.
End it.

what the fuck? why would you post this?

So, having a decent modern CPU is better than having an outdated CPU. Gotcha. Fucking brainlet.

Why would you waste your time and energy like that? Just buy a new board.

OP is talking specifically about 4K; there's no such thing as a 30-40% increase from a CPU at that resolution.

Not everyone is a consumerist faggot like you, user.

Buddy, the only way in hell someone can see pixels at 1080p is if they have some fucking cybernetic eye implants. That, or they are so extremely autistic that they spend at least 80% of their game time staring at tiny details you wouldn't even notice in real life instead of enjoying the game, then getting butthurt if they see something unlifelike, which is reasonable seeing how it's a fucking game. Seriously, when will gaming graphics finally reach a point where it's enough? I for one don't want games to look perfectly like real life.

>95% of the time
That's what makes this post retarded. Games come out all the time that are CPU-intensive. You can get by with a 2600K, but your performance is going to be hit or miss by a pretty large margin.

Jesus fuck I’d rather kill myself. It’s stressful enough just trying to screw the fucking fan on.

Who gives a fuck about playing on ultra?
Good-looking games have good art styles, not graphical fidelity.

...

...

GPU memory > 16+ GB RAM > CPU clock speed

You can't see any farther.
Yes, the higher res does look better, but not several hundred dollars' worth better.

>I for one don't want
Then save your money and enjoy your games at low settings.

No amount of AA is going to completely remove aliasing artifacts, and even with flawless AA you're still getting something blurrier. And max-quality AA requires a good bit of GPU muscle itself anyway.

If you want to live an inferior lifestyle I’m glad you’re doing what makes you happy. But I am having more fun than you.

Maybe you should just buy a Dell once every 7 years kiddo

same

>But I am having more fun than you.
Not him but you're a retard.

Enjoy being the definition of a good goy

Not him, but it does make sense: the picture is sharper (but you need a bigger screen), and you can see small details better.

The only meaningful difference I see here is that you've increased the draw distance of some vegetation, which has nothing to do with resolution.

My first rig was an i7-970 build and I agree. I only have 8 GB of RAM, and the only time I've had framedrops as a result of pushing my rig was when I was running through Novigrad in The Witcher 3 at 4K (and in that game, if you aren't cranking the settings, you're doing it wrong). Granted, I had to run stuff like Mankind Divided at medium-high settings, but that game was just badly optimized and everyone knows it. Having 4 cores at 4.0 GHz + hyperthreading is pretty sweet when you just need to crunch through with raw power. Great for emulators and shit, too. Well, except VBA, because there's no way (that I know of) to cap how much you speed the game up when you fast-forward.

Who gives a shit about seeing a fucking bush in the distance?
How is that worth buying a new mobo, CPU, GPU, monitor, and a higher-DPI mouse?

This doesn't help much for people who aren't using 4K monitors.

This is my current "rig"
Say something nice about her!

I have a 7600K and a GTX 760 graphics card.
The 7600K is always the limiter on performance. Always.

Just kidding; the Chinese are collecting hooves, and the more pain an animal suffers while this procedure is done, the more money they can get from their customers.

>effect

The fuck else you think people build gaming PCs for? Pussy?

All the people buying shitty consoles, lootboxes, crappy rehashed AAA games, and I’m the goy for maintaining a pc? Fuck, time to neck myself.

Savage, user. Well done.

In that game, it does. Higher resolution allows more things to be rendered in the distance. It's not a graphics setting, it's how the engine works.
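
If you want a guess at the mechanism (a sketch of one plausible rule, not the actual engine code): engines commonly cull objects whose projected on-screen size falls below some pixel threshold, and a higher output resolution means that threshold is crossed farther away.

# Hypothetical draw-in rule: render an object only while its projected
# height covers at least MIN_PIXELS pixels. With a simple pinhole
# projection (ignoring FOV): projected_px ~= height_m / distance_m * screen_px,
# so the cutoff distance is height_m * screen_px / MIN_PIXELS.
MIN_PIXELS = 4.0  # invented cutoff, not a real engine value

def max_draw_distance(height_m, screen_px):
    return height_m * screen_px / MIN_PIXELS

for res in (1080, 2160):
    print(f"{res}p: a 1 m bush draws in out to {max_draw_distance(1.0, res):.0f} m")

# 1080p: a 1 m bush draws in out to 270 m
# 2160p: a 1 m bush draws in out to 540 m  (double the pixels, double the distance)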