Will the 2500k ever not be relevant?

>what is Moore's law?

>what is no competition

Every i5 is already not relevant if you actually play video games instead of shitposting.

>gtx 1080 sli
>on 1080p
LMAO
nice meme

An observation and a goal, not an actual law.

>muh video games
Some people actually have to use a computer for things other than gaming and fapping to shemale porn.

But one that has stood the test of time. Obviously the number of transistors isn't the only factor in judging a microprocessor, though.

Not an argument
That's why I hate fucking charts. There's literally no way to tell if they're fake, and even if they're not, there's no way to tell where exactly in the game they tested those CPUs. Witcher 3 can run on a fucking i3 but starts shitting itself when you go to Novigrad etc.

Lmao, I'm playing BF1 on my non-OC'd 2500k right now and it runs like a charm

It really depends on the workload. A lot of games are so heavily GPU-bottlenecked that the CPU has only a minimal impact. Some other games heavily favor more CPU threads. Unfortunately Anandtech doesn't have a good selection of games in their benches. If they included titles like TW3, DOOM, Gears of War 4, Hitman, and others, it would paint a different picture.
anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/22


In other tasks Sandy Bridge is really starting to show its age vs newer chips.
anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/9

>i3 6100 @ 4.6GHz outperforms i5 2500K
laughingfrog.tar.gz

You know what else "stood the test of time"? The peak oil concept. Until we discovered fracking and it stopped working.

This

That setup better be running 1440p 144Hz or 4K, or idk what the fuck you're doing

>And then you actually test CPU-heavy zones in the game and watch the fps dip and the game freeze.

I always said it and I'll say it again: Sup Forums is absolutely tech-illiterate when it comes to gaymen hardware, mostly because nobody in here actually plays games.

I have a 3770k @ 4.4GHz and a GTX 1080 and I can't get 60fps in Witcher 3 on 1440p

Is it a CPU bottleneck?

I mean, I can but not consistently

Especially in busy indoor areas with lots of lighting, it drops to the 40s

Why don't you install MSI Afterburner, run RivaTuner, and check it for yourself, you illiterate retard?

h-how do I use it to see a cpu bottleneck?

- Install MSI Afterburner.
- Install the optional RivaTuner OSD.
- Go to MSI Afterburner, settings, monitoring.
- Select all the options you want displayed in the OSD, like GPU usage and CPU1/2/3/4/5/6/7/8 usage, and tick "display in OSD".
- Run the game while MSI Afterburner and the OSD are running in the background.
- Go to the fps-dropping zone in the game.
- See if the CPU usage percentage reaches the 90s.

If yes, it's a CPU bottleneck; if not, it's probably the GPU or slow-ass RAM.
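
If you'd rather log the numbers than eyeball an OSD, here's a minimal script sketch of the same check (assumptions: Python with psutil installed, and an NVIDIA card so nvidia-smi is on the PATH; neither is mentioned above, Afterburner's OSD does all of this for you):

[code]
import subprocess
import psutil  # pip install psutil

def gpu_utilization_pct():
    # nvidia-smi prints GPU load as a bare integer with these flags (NVIDIA only).
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"])
    return int(out.decode().strip().splitlines()[0])

# Per-core CPU usage over a 1-second sample, like the OSD's CPU1..CPUn readouts.
per_core = psutil.cpu_percent(interval=1.0, percpu=True)
print("CPU per core (%):", per_core)
print("Busiest core (%):", max(per_core))
print("GPU (%):", gpu_utilization_pct())
[/code]

Run it while standing in the fps-dropping zone; a single core sitting in the 90s is the tell.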

t-thanks, looking at it now

not an argument
youtube.com/watch?v=DxGge0tR4IM

I used to get 70+ fps with an OC'd 1070 coupled with a 4690k OC'd to 4.4GHz, at 1440p.

GPU utilization is a better way to spot a CPU bottleneck. Ideally you want the usage to be 100% all the time when uncapped.

>i7 vs i5


lmao

You look at both desu. If the GPU holds 99% even while CPU usage runs high, it's okay. If CPU usage is in the 90s while the GPU drops below 99%, you have a CPU bottleneck, like pic related

The peak frame rate really doesn't matter. What matters is how hard the frame rate falls when you start having a lot of stuff happening.
The general trend today is that newer titles are all running on engines that can make use of more threads, and the CPU with more threads will slow down less when stressed.

Simplistic bench charts aren't good enough any more. Video reviews that show frame rates and frame times do a much better job.
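
To make the frame-time point concrete, here's a toy sketch of why an average-fps chart hides stutter (the numbers are made up; real frame times would come from a capture tool like PresentMon or a RivaTuner benchmark log):

[code]
# Made-up frame times (ms) for 100 frames: mostly 60fps with a few spikes.
frame_times_ms = [16.7] * 95 + [33.3] * 4 + [50.0]

def fps_stats(times):
    avg_fps = 1000 * len(times) / sum(times)
    # "1% low": average of the slowest 1% of frames, converted to fps.
    worst = sorted(times)[-max(1, len(times) // 100):]
    one_pct_low = 1000 * len(worst) / sum(worst)
    return avg_fps, one_pct_low

avg, low = fps_stats(frame_times_ms)
print(f"avg {avg:.0f} fps, 1% low {low:.0f} fps")  # avg ~57, 1% low 20
[/code]

A bench chart showing only the ~57 fps average would call that smooth; the 20 fps 1% low is the stutter you actually feel.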

this, my GPU was heavily bottlenecked by the shitty FX 6300, only 50% of it was being used.

I never actually experienced drops below 60 FPS.

It's not a CPU bottleneck if you have everything maxed; TW3 is very demanding. You won't get 60+ FPS at all times. If performance is obviously shit, like you're getting 30, then something must be wrong.

This but Sup Forums hates "youtube reviews" because those videos with actual gameplay footage always make AMD look like shit.

Your point? OP mentioned the 2500k, a 4-core i5. The i5 is pretty much dead in 2017, since games using more than 4 cores will become common practice and we have no real IPC gains.

I also forgot to mention that you need to look at both, because if both the GPU and CPU show low usage it could be something like shitty coding failing to utilize your hardware, or a faulty PSU.
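
Putting this together with the earlier post, the rough decision table looks something like this sketch (the 90%/99% cutoffs are the thread's rules of thumb, not hard numbers):

[code]
def diagnose(busiest_core_pct, gpu_pct):
    # Thresholds are the thread's rules of thumb, not hard numbers.
    if gpu_pct >= 99:
        return "GPU-bound: normal when uncapped, the CPU is keeping up"
    if busiest_core_pct >= 90:
        return "CPU bottleneck: a core is pegged while the GPU starves"
    return "Both low: suspect shit engine threading, slow RAM, or a faulty PSU"

print(diagnose(busiest_core_pct=95, gpu_pct=80))  # -> CPU bottleneck
[/code]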

To add, my CPU is OC'd to 4.4GHz so I'm not sure what OC the video is running, but I was just replying to the anon who said he can't get 60 FPS in Witcher 3 with a 1080 lol.

that only applies to really shit processors or dual cores. with modern quads or above, a CPU bottleneck usually shows up as the GPU occasionally not being fully utilized while CPU usage rarely goes beyond 60%

My i5 3570k shouldn't be considered a "shitty CPU", and the bottleneck is real in so many games.

It doesn't matter what GPU you have. You could have a fucking 1080 Ti. If you run a game that eats your average-tier CPU, like Watch Dogs 2, you will still dip to the 40s at 1080p.

>since games using more than 4 cores will become a common practice
are we back to 2010?
2012?
2014?
4 cores will be the sweet spot for at least another 4 years.
people have been saying the same for years. only a small percentage of games benefit from more cores to this day.
>just wait! gaymes are going to use more cores next year for sure!!!

i see so many faggots trying to game on the 2GHz octacore Xeons they got from eBay and then complaining about how low their GPU benchmark scores are

it's to bring the CPU bottleneck to the forefront, dumbass. at 4K the CPU would be asleep.

your post is worded like quad-core is best, but you're actually admitting more cores are better

learn2read
4 cores are still enough. more cores matter in only a handful of games.

Like I said, illiterate as always. 4 cores ain't enough; you need at least 8 threads in 2017. I can name at least 5 games off the top of my head that will bottleneck the fuck out of your i5 no matter the OC: Witcher 3 in Novigrad, GTA 5 with custom radio, Watch Dogs 2, Arma 3, Rise of the Tomb Raider.

woop there it is

So you literally need to buy a modern i7 for the improvement to be big enough to justify a new CPU

Literally no reason to upgrade yet

The Battlefield games, Forza Horizon 3, Hitman, Gears of War 4, tons of new games will have Haswell-E and Broadwell-E at the top of benches. Not only in highest frame rates, but also highest minimum frame rates.

There's a serious case for buying a 6c/12t platform for gaming.

>will the 2500k ever not be relevant?

Not as long as Intel has no competition.

>you need at least 8 threads in 2017
haha wtf, threads don't even matter.
hyperthreading is a scam for gayman

Take your opinions and go right back to Sup Forums with them.

tfw have fx 6300

What do I buy to upgrade? i5 or wait for Zen?

But only gaymen care about cores and threads.
I'm just stating 4 cores are still good, and will be good for years to come.
you go back.

The 5775C was a really nice CPU; shame Intel made it almost impossible to own.

Irrelevant these days.

>logical threads are a scam!
>b-b-but games care about threads
>I can't even make two posts with a single coherent position

I don't know what you're laughing about. His image literally shows nearly a 20 FPS difference because of threading.

They were too expensive for Intel to fab so early into its 14nm ramp-up, so they halted production. Only a few lots of chips ever made it to market.

Are you a time traveler from 2010?

Welp, looks like it was the GPU after all

Do I need a bloody Titan X to run this game at a consistent 60fps or is there a problem? I did a clean install of drivers...

My Sandy Bridge i7 is still kicking ass to this day. I'm still maxing games with a GTX 670. Almost tempted to post a screenfetch

I've been noticing that i5s get stung a bit by their lack of threading in gaming; one of the projects I'm working with fully freezes for a few seconds at a time occasionally. Maybe it's a symptom of consoles having 8 cores, but Zen bringing 8 threads to the mainstream couldn't have come at a better time.

>and if it is none of those, the game is just written like shit

>that part of this quest inside the mansion when it all goes spoopy as fuck
Hearts of Stone was a good expansion.

Are you guys retarded? It's a CPU benchmark.

The Witcher series was bretty gud in general. Witcher 1, despite the gameplay being shit, was really entertaining and had a pretty good story.