MOAR COARS

Can we talk about the potential of having more than four physical cores?

The possibilities of telling Windows what to do with them.

At this point in time my 3570K will not overclock to 4.8 any longer. 4.6 is no problem, but I clearly wore the thing down.

A 1700 (or better) at 4GHz is basically two of the same CPU in one chip, leaving a whole second half free for OBS or whatever else you want to throw at it.

Where do we go from here?

Obviously Intel's "extreme" stuff does it too

And quite well

But where are the gains? Because vidya benchmarks don't seem to be showing them yet

I hope for the future

>At this point in time my 3570K will not overclock to 4.8 any longer.
how much voltage were you giving it, OP?

1.345V
It used to do 4.8 at that voltage without complaint
That was always the Ivy Bridge thing... Sandy Bridge performance at lower voltage, but you pay for it with the guvmint kill switch

Pic related is my CPU
It only does 4.6 at that voltage now

The gains will come.

As I've said in a couple of other threads: Intel appears to have already hit the single core performance wall. The 7740k looks like it's not going to perform meaningfully better than the 7700k.

This means Intel is going to be forced to pivot towards a moar cores approach for the consumer space with everything after the 7740K, and, in turn, it means Intel will have to start encouraging (rather than discouraging) developers of mainstream consumer applications and games to optimize their code for a much more multi-threaded CPU environment.

For gaming in particular, a lot is going to depend on whether Vega is good, as Nvidia's current software-scheduled approach heavily favors single-threaded CPU performance over multi-threaded. I don't see Nvidia doing much to fix that unless Vega is competitive and Intel starts demanding they fix it as Intel is forced to move towards a moar coarz approach itself.

Intel doesn't discourage it, because if games could use more cores Intel could sell HEDT to gamers with money.

It's kind of a combination you get: the mainstream, or the majority of the market, is using up to 4 cores, and developers will gladly stay on fewer cores, because working with more threads in an effective way is hard with gaming.

Man, I have zero brand loyalty. None.
If anything I cheer for the underdog at times.
I just need these tech fucks to move forward is all.

Even so, Intel is going to be compelled to start really pushing developers (and Nvidia) to write their games/applications and drivers in a more multi-threaded way since Intel is not going to have a choice about taking a MOAR COARS approach themselves.

How well does setting core affinity in Windows actually work? Does it just stick to those cores and not ask questions?
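Pretty much, yes: hard affinity is honored, so Windows will only schedule the process's threads on cores whose bits are set in the mask. Task Manager's "Set affinity" is the point-and-click version; below is a minimal sketch of doing the same thing programmatically with the Win32 call (pinning to the first four logical cores is just an example):

// Minimal sketch: pin the current process to logical cores 0-3 via the Win32 API.
// The mask is a bitfield: bit N set = threads may run on logical core N.
#include <windows.h>
#include <cstdio>

int main() {
    DWORD_PTR mask = 0x0F; // logical cores 0, 1, 2, 3
    if (!SetProcessAffinityMask(GetCurrentProcess(), mask)) {
        std::printf("SetProcessAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }
    std::printf("Pinned to cores 0-3; the scheduler won't run our threads anywhere else.\n");
    return 0;
}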

Yeah, I agree, but don't count on Intel pushing Nvidia. If anything Nvidia and Intel hate each other; Intel was trying to destroy Nvidia with Larrabee, and Intel even shifted to AMD for their new embedded graphics.
I bet in this case Nvidia will be compelled by competition instead, since the Xbox One uses a subset of DX12 on AMD hardware.

So-so. I really wonder how many devs are up for doing multi-core scaling under DX12. The appeal is small and few may be tempted, except maybe AAA devs. When it's done right it's amazing though; in the last Hitman I got 100% scaling and it's fucking amazing.

>Intel doesn't discourage it, because if games could use more cores Intel could sell HEDT to gamers with money.
$1000 for a CPU is just out of the question for most people, and even those who can afford it would rather not.

Intel needs as much software as possible to not scale beyond 4-8 threads. It's how they've been able to sell the same quadcores all this time and make massive profits.

If software scaled up better, Intel's quad cores would be destroyed by AMD and they'd have to lower their profit margins.

> working with more threads in an effective way is hard with gaming.
It's also hard to make a modern game in general. It's not hard enough to be an excuse. Plus they are already doing it on the consoles.

That's why I stressed the importance of Vega being competitive, as the results a competitive Vega would generate paired with Ryzen, especially compared to Nvidia cards paired with Ryzen, might be enough to light a fire under Nvidia's ass to stop pursuing the strategic dead end of highly single threaded software scheduling.

Of course, for Nvidia they face a really difficult strategic decision on that front as well, because if Nvidia finally gives up the software scheduled approach and actually tries to go back to a more async hardware scheduled approach they are going to be starting back from initial housefires levels when AMD has already had a multi-year head start in overcoming those issues.

>Intel appears to have already hit the single core performance wall
So there's no chance that DARPA is going to bail us out of this one?

Moar cores is a meme because of Amdahl's law. Most games are not CPU bound as it is. Moar cores is useful in specialised situations, but not for the average consumer. DX12's multi-core gains are most apparent on Bulldozer, not on CPUs with competent single core performance like Ryzen.

Maybe games will move towards optimizing for over 4 cores.
Not most software, however.
Laptops hold a much larger market share than desktops, and more cores == shittier battery life. Hell, dual core is still the standard for laptops.
If the CPU companies can find a way to make more cores more efficient, then it might happen.
The reality is that the next area for CPU improvements is cache, not more cores.

This may have been true up until now, but with AMD's Zen architecture being extremely power efficient at lower clocks you can expect to see 6 and 8 core laptop CPUs very soon.

To all the naysayers about Intel moving to MOAR CORES - It's already happening. Prepare your anuses. wccftech.com/intel-coffee-lake-desktop-6-core-4-core-cpu-leaked/

Until you get to the very low end, where you have Atom CPUs with up to 4 cores and ARM CPUs with even more than that.

It will take a long time for the install base of Ryzen laptops to reach a point where software developers can expect laptops to have 4 or more cores. People are holding on to their laptops longer and longer because there is barely any reason to upgrade with each new generation of hardware. It's definitely a good thing that AMD is getting a head start on the more-cores thing and making it viable for laptops, but I expect that people who just bought Ryzen will be looking to upgrade anyway by the time operating systems and software actually require more than 4 cores.

Also, if quad core Ryzens are power efficient, and laptops only need dual cores for now, most laptop manufacturers will probably go with dual core Ryzen CPUs for even more efficiency.

Those are fair points, but even to the extent that you are correct, I'm not sure how much it will really impact CPU intensive non-gaming consumer workloads since those will more commonly be expected to be run on desktops where Intel will shortly be moving to a moar coars approach and AMD is already there, no?

And, for the most part, most of those non-gaming CPU intensive workloads already scale much better to a large number of coars/threads than games do anyway, so I don't see that being too big of an issue.

I wonder if the i3 will have 3 or 4 cores now. A quad core i3 with hyper threading would effectively make it an i7 from generations past, which would be awesome for people on a budget.

I'd imagine it would be 3 cores, so as not to piss off current i7 owners and to push them towards the more expensive 6-core i7s.

Preventing people from switching to Ryzen is probably more important to them than pissing off i7 owners.

I better post this while I still have the chance.

>Moar cores is a meme because of Amdahl's law.
...Which only applies to some scenarios. Even when only 50% of the work is parallelized, 8 cores is still the sweet spot. And even then it means you have more CPU power left over for other stuff.
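To put rough numbers on that, here's a quick sketch of Amdahl's law with a fixed 50% parallel fraction (the 50% is just the assumption from the post above):

// Amdahl's law: speedup(N) = 1 / ((1 - p) + p / N), p = parallel fraction of the work.
#include <cstdio>

int main() {
    const double p = 0.5;                 // assume only half the work parallelizes
    const int cores[] = {1, 2, 4, 8, 16};
    for (int n : cores) {
        double speedup = 1.0 / ((1.0 - p) + p / n);
        std::printf("%2d cores -> %.2fx speedup (hard cap %.1fx)\n", n, speedup, 1.0 / (1.0 - p));
    }
    return 0;
}

That prints roughly 1.33x at 2 cores, 1.60x at 4, 1.78x at 8 and 1.88x at 16, against a 2.0x cap, so 8 cores already gets you about 89% of the theoretical maximum and going further barely moves the needle.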

>Most games are not CPU bound as it is
...Which is an argument for more cores, since extra cores matter less in games and more in pretty much everything else.

>Moar cores is useful in specialised situations, but not for the average consumer.
No, single threading is the specialized situation. Web browsing and video decoding are probably the most common use cases, and they are heavily parallelized.

>Maybe games will move towards optimizing for over 4 cores.
>Not most software, however.
Games have slowly but steadily been optimized for more cores since Bulldozer came out. But what software are you referring to? General use is already parallel, and so is productivity software other than Photoshop.

>and more cores == shittier battery life
Incorrect.

Cheap ARM chips can play back and record 4k video just fine. Nobody is going to buy a $200+ desktop CPU for imperceptibly better web browsing and video playback. Your average consumer doesn't even buy desktop computers anymore; laptops dominate the market. Pretty much the only desktop market left is businesses that need cheap computers to fill cubicles with, content creators who need very high end gear, and gamers who fill in the gap in the middle. For the average consumer, performance is already so good that computer manufacturers are in a race to the bottom to make the cheapest computers, not the most powerful. See: the success of Chromebooks, how Microsoft is pushing a cloud based version of Windows, and how even Apple is putting garbage CPUs in their laptops because nobody cares.

>Amdahl's meme
Ignores the fact that games can and do have multiple independent systems (for example: AI, netcode, feeding the GPU, calculating what each player can see, ballistics, economy systems, etc.) that can run on their own threads; see the toy sketch at the end of this post.

Obviously games aren't the only workload that can benefit from multiple threads but it's the one "muh sangle ker" is spouted at most often

Add to that background tasks like drivers, downloads, maybe a music player or browser, background updates, antivirus, etc. You're delusional if you think single-threaded performance is the future.
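Here's the toy sketch mentioned above of what that kind of task-parallel frame loop can look like; the subsystem names and bodies are hypothetical placeholders, the point is just that independent systems fan out to their own threads each frame and the frame joins on all of them:

// Toy frame loop: independent game systems fan out to worker threads, then the
// frame waits on all of them before moving on. Subsystems are placeholders.
#include <future>
#include <cstdio>

void update_ai()         { /* pathfinding, decision making ... */ }
void update_netcode()    { /* send/receive packets, reconcile state ... */ }
void update_ballistics() { /* projectile physics ... */ }

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        auto ai   = std::async(std::launch::async, update_ai);
        auto net  = std::async(std::launch::async, update_netcode);
        auto ball = std::async(std::launch::async, update_ballistics);
        ai.get();
        net.get();
        ball.get(); // frame barrier: every system has finished before the next frame
        std::printf("frame %d done\n", frame);
    }
    return 0;
}

Real engines use persistent job systems instead of spawning tasks from scratch every frame, but the dependency structure is the same.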

And for the foreseeable future, 4 cores can handle everything you listed with ease. Quad core i5s do better than 6 or sometimes 8 core Ryzen CPUs in a shit ton of games. More cores will be required some day, but you'll have to Just Wait™.

For everything that isn't gaming or highly specialized business-class applications, both single-core and multi-core performance is already more than what most people need, so the focus is on power consumption to improve battery life.

Core 2 Quads are still good enough for casual use, even some light gaming. The point where you needed to upgrade every few years or your PC was mashed-potato tier ended almost a decade ago.

Electromigration... Reminds me of my 3570K at 4.6 @ 1.1V

More than four cores is useless for any office/desktop task. I don't notice any difference between this system and my 5820K rig, except this one doesn't run on an SSD.

You'll need more cores for high-FPS gaming and conversions / rendering. For anything else your i5 is more than sufficient.

>Cheap ARM chips can play back and record 4k video just fine.
Sure, when the content is optimized for the decoder. For example, YouTube on x86 is supposed to be hardware accelerated, but more often than not it just doesn't work.

The basic shit people do today doesn't require more power than a Pentium 2 or something plus a video decoder. But Windows is bloated, apps are bloated, websites are bloated. Eventually people will need more power to do the same things, especially once we see CPUs actually improving. It might take a long while, but eventually a quad core will be the minimum.

Sandy Bridge stutters badly in games if you leave Chrome open in the background. Kaby Lake is going to be the same way, but way sooner since there's actual competition now.

But why did the discussion shift to minimums?

Pentium 2s are too slow nowadays, I would say at least a Core 2 Duo 3 GHz. You will still feel it when you install larger programs or unzip files.

I cannot understand how there are still people with singlecore processors out there...

>start encouraging (rather than discouraging) developers of mainstream consumer applications and games
As a GNU/Linux user I don't get this, and never did.

Either most of the cores sit there idle and the computer isn't doing much
OR
I do something CPU-intensive and all the cores fire up and remain taxed at 100% until it's done.

This really depends on the cores. For example, the mobile phone approach of 4+4 "8-core" CPUs is very efficient: 4 cores that max out at 1GHz and barely use power, plus 4 cores that max out at 2GHz for when that's needed, works really well. Obviously you're not going to get the power of more than 4 cores for games. I'm just saying that more cores can == way better battery life.

>People are holding on to their laptops longer and longer
I personally believe we're about to see a big bump in laptop sales this year and that many who have seen no reason to upgrade will be tempted. Not because of better CPUs; it's the screens that are FINALLY improving. I've been looking at new laptops for a while. 1366x768 TN panels? And the same garbage TN panel on $300 and $2000 devices, unless you buy Apple (no wonder MacBooks sell)? What have they been thinking? In the last few months I've seen a few laptops that actually have 1080p and 1440p IPS panels. IPS has become "cheap enough" to put in laptops. This, to me, would be a reason to upgrade.

>Cheap ARM chips can play back and record 4k video
That's true, but ARM CPU cores can't, not even close. The best can barely software-decode HEVC at 1080p. Good thing they don't need to, since all ARM SoCs have dedicated ASICs for both video encoding and decoding. The "PC industry" is far, far behind here. The latest AMD RX graphics cards are finally able to do H.265 hardware decoding. But no best-quality YouTube VP9 for you, because that would be catching up to where the cell phone industry was 2 years ago.

>More than four cores is useless for any office/desktop task
Bullshit. Any CPU-demanding task will tax all cores until the task is done. Try encoding a video and see for yourself.
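For example, something like ffmpeg -i input.mkv -c:v libx264 output.mp4 (stock ffmpeg with the libx264 encoder, filenames made up) will spread the encode across every core it can find by default; watch Task Manager while it runs and they'll all be pegged.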

I know, but that's because software is bloated today.

It's simple. Devs are lazy and will only optimize when they need to. The only reason decoders are relied upon on ARM is because it wouldn't work otherwise.

On x86 it already works without it, so little effort is put into fixing it.

Yes, but you cannot do anything about it. I preferred "oldapps" versions back when I was stuck with my shit PC, and I tried to use legacy versions of websites (I really miss Yahoo Mail Classic, good old times), but those aren't available anymore.

Learn to read properly. I said desktop/browsing tasks, not gaming or rendering. Those things will love the extra cores, same with compression.

And most content is optimized for the decoders those chips have. I've literally never run into anybody in the past few years who complained about video playback on their phone.

And I'm not shifting anything; I was always talking about quad cores being relevant, because not even most AMD customers are buying 8-core CPUs, they're buying 4- or 6-core Ryzen 5s. There isn't such a huge gap between 4 and 6 cores that games which run well on 6 cores are going to run poorly on 4 cores. It's going to take a really long time before 8-core chips offer the best price/performance.