Intel

All this wasted space... why doesn't Intel drop the iGPU and build "bigger" cores for higher IPC?

They should fill all that space with L4.

Tell me why onboard (chipset) graphics went away. It's not like Intel keeps the same socket for more than five seconds.

Should have gotten Ryzen

You have to pay Intel big $$$ for the privilege of not having half the die space wasted by an onboard GPU. HEDT, Xeon, etc.

I mean, kill the iGPU on 4-core high-end CPUs.

This DESU
Mad frametimes.

That just looks like two CPU units glued together.
Am I wrong? Wouldn't that slow things down?

They cucked desktop users by selling them laptop cpu rejects.

Don't bother trying to argue that. Intel shills here will piss all over you, claiming the iGPU is well worth having instead of extra cores. I tried arguing a few months back that the "K" series shouldn't have the iGPU, and a dozen shills tried to explain to me how that iGPU was the best thing since sliced bread and how most people who buy a "K" series don't even use dedicated graphics. Four cores ought to be enough for everybody!

Core complexes are a key part of the Zen architecture. And it works nicely

You are wrong. Intel even did that before AMD did with Ryzen. That's how all Core 2 Quads were made: two physical dual-core dies glued together.

Oh baby.

It is actually two quad-core units put together via the memory controller. Still works well, however.

To be fair, R3s, R5s, and R7s are not multiple dies glued together. AMD put two quad-core complexes (CCXes) on one die, each with its own shared L3, linked by Infinity Fabric to communicate.

There are some advantages to this, such as increased yields and stronger performance (less latency, since shorter distances) than Intel's design of just slapping two separate dies together on a single package.

Though Threadripper is more similar to that, since AMD actually is gluing multiple dies together there.

The iGPU is only useful for saving power with Bumblebee on Linux, where the discrete NVIDIA card can be powered on/off at runtime.
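
For reference, here's a minimal sketch of the switch Bumblebee drives under the hood: the bbswitch kernel module exposes /proc/acpi/bbswitch, and writing ON/OFF there powers the discrete NVIDIA card up or down. Needs root and the module loaded; normally bumblebeed does this for you when the last optirun client exits.

```c
/* Toggle the discrete NVIDIA card via bbswitch (used by Bumblebee).
 * Assumes Linux with the bbswitch module loaded; run as root. */
#include <stdio.h>
#include <string.h>

int main(int argc, char **argv) {
    /* "./a.out on" powers the dGPU up, anything else powers it down */
    const char *state = (argc > 1 && !strcmp(argv[1], "on")) ? "ON" : "OFF";
    FILE *f = fopen("/proc/acpi/bbswitch", "w");
    if (!f) { perror("open /proc/acpi/bbswitch"); return 1; }
    fprintf(f, "%s\n", state);
    fclose(f);
    printf("dGPU switched %s\n", state);
    return 0;
}
```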

Core 2 Quad scaling was shitty, since the cores had to hop through the northbridge to talk to the second dual-core die.

...

see
>AMD put two quad-core complexes (CCXes) on one die, each with its own shared L3, linked by Infinity Fabric to communicate.
>There are some advantages to this, such as increased yields and stronger performance (less latency, since shorter distances) than Intel's design of just slapping two separate dies together on a single package.
Also want to add: increased bandwidth as well. And Infinity Fabric at 2666 MHz is faster than Intel's NUMA setup. With faster RAM, it's even faster.

>Why don't they just make everything faster.
-guy who doesn't know jackshit

infinity fabric sounds like marketing bullshit

>Infinity Fabric at 2666 MHz
With 2666 MHz RAM, that is.
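
To put numbers on that: on Zen the fabric clock runs at the memory clock, i.e. half the DDR4 transfer rate, which is exactly why faster RAM speeds up cross-CCX traffic. A back-of-the-envelope sketch; the 32 B/cycle link width is an assumption for illustration, not a confirmed spec:

```c
/* Fabric clock and rough per-link bandwidth vs. DDR4 speed on Zen.
 * FCLK = MEMCLK = half the DDR transfer rate; 32 B/cycle is assumed. */
#include <stdio.h>

int main(void) {
    const int ddr_rates[] = {2133, 2400, 2666, 3200};   /* MT/s */
    const double bytes_per_cycle = 32.0;                /* assumed width */
    for (size_t i = 0; i < sizeof ddr_rates / sizeof ddr_rates[0]; i++) {
        double fclk_mhz = ddr_rates[i] / 2.0;           /* fabric clock */
        double gbs = fclk_mhz * 1e6 * bytes_per_cycle / 1e9;
        printf("DDR4-%d -> fabric %.0f MHz, ~%.1f GB/s per link\n",
               ddr_rates[i], fclk_mhz, gbs);
    }
    return 0;
}
```

So DDR4-2666 gives you a 1333 MHz fabric, and going to DDR4-3200 is a free ~20% cross-CCX bandwidth bump.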

Though with proper scheduling, Ryzen's design should behave no differently from a regular monolithic 8-core. And even with shitty scheduling, the impact is much smaller than what Intel's was with the Core 2 Quads.
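
And if the scheduler won't cooperate, you can enforce it yourself. A minimal Linux sketch that pins a process to one CCX; the assumption that logical CPUs 0-3 map to CCX 0 depends on the part and on SMT, so check lscpu first:

```c
/* Pin the calling process (and any threads it spawns afterwards) to
 * CPUs 0-3, assumed here to be one CCX. Linux-only. */
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>

int main(void) {
    cpu_set_t set;
    CPU_ZERO(&set);
    for (int cpu = 0; cpu < 4; cpu++)
        CPU_SET(cpu, &set);                 /* CPUs 0-3: assumed CCX 0 */
    if (sched_setaffinity(0, sizeof set, &set) != 0) {
        perror("sched_setaffinity");
        return 1;
    }
    printf("pinned to CPUs 0-3; threads will stay within one CCX\n");
    /* ... spawn worker threads here; they inherit the mask ... */
    return 0;
}
```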

>Implying Intel wouldn't just cut off the GPU and sell the CPU for gaymers for higher clocks and shit

Thats was we want.

>was
What*

Gaymen machines aren't a significant share of the market, and no one cares about them. The iGPU is useful; maybe not for you, but you're irrelevant to Intel. Before it was integrated into the CPU, they just integrated it into the motherboard chipset; now the chipset itself is being integrated into the CPU.

Don't forget, AMD is getting something like 80% yields for fully enabled dies, and is likely getting wafers where virtually every die coming off them is at least partially usable.
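
A toy Poisson yield model, Y = exp(-D*A), shows why the small die matters. Backing the implied defect density out of that claimed ~80% (Zeppelin's ~213 mm² area is an assumed figure here) and applying it to hypothetical bigger monolithic dies:

```c
/* Poisson yield model: Y = exp(-D * A).
 * Infer D from the claimed ~80% yield on an assumed 2.13 cm^2 die,
 * then apply it to larger hypothetical die areas. */
#include <math.h>
#include <stdio.h>

int main(void) {
    const double y_small = 0.80;                 /* claimed yield */
    const double a_small = 2.13;                 /* cm^2, assumed */
    double d = -log(y_small) / a_small;          /* defects per cm^2 */
    printf("implied defect density: %.3f /cm^2\n", d);

    const double areas[] = {2.13, 4.0, 6.0};     /* cm^2 */
    for (size_t i = 0; i < 3; i++)
        printf("%.2f cm^2 die -> %4.1f%% yield\n",
               areas[i], 100.0 * exp(-d * areas[i]));
    return 0;
}
```

So a 6 cm² monolithic die at the same defect density would yield only around 53%, which is the whole argument for gluing small dies together.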

But most software is not multithreaded.

>pro bought 6700k/7700k

>Gaymen machines aren't a significant share of the market and no one cares about them

Only the customers willing to pay enough get a CPU without the useless iGPU.

The CCX design gives each core substantially lower latency to its cluster's 8 MB of shared victim L3 and its three neighbors' combined 1.5 MB of L2, in exchange for moderately higher latency to the remote cluster.

Zen was designed from the get-go for strapping a bunch of dies together, so this choice was clearly more about scalability than anything else.
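
That latency tradeoff is easy to measure yourself: bounce an atomic flag between two pinned threads and compare a same-CCX pair against a cross-CCX pair. A minimal Linux sketch; which CPU numbers land in which CCX is an assumption, so check your topology first:

```c
/* Core-to-core round-trip latency via an atomic ping-pong.
 * Build: gcc -O2 -pthread pingpong.c
 * Run:   ./a.out 0 1   (same CCX, assumed)
 *        ./a.out 0 4   (cross CCX, assumed)                         */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdatomic.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define ITERS 1000000

static _Atomic int flag = 0;

static void pin(int cpu) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(cpu, &set);
    pthread_setaffinity_np(pthread_self(), sizeof set, &set);
}

static void *ponger(void *arg) {
    pin(*(int *)arg);
    for (int i = 0; i < ITERS; i++) {
        while (atomic_load_explicit(&flag, memory_order_acquire) != 1)
            ;                                     /* spin for ping */
        atomic_store_explicit(&flag, 0, memory_order_release);
    }
    return NULL;
}

int main(int argc, char **argv) {
    int cpu_a = argc > 2 ? atoi(argv[1]) : 0;     /* pinger CPU */
    int cpu_b = argc > 2 ? atoi(argv[2]) : 1;     /* ponger CPU */
    pthread_t t;
    pthread_create(&t, NULL, ponger, &cpu_b);
    pin(cpu_a);

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < ITERS; i++) {
        atomic_store_explicit(&flag, 1, memory_order_release);   /* ping */
        while (atomic_load_explicit(&flag, memory_order_acquire) != 0)
            ;                                     /* spin for pong */
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);
    pthread_join(t, NULL);

    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
    printf("CPU %d <-> CPU %d: %.0f ns per round trip\n",
           cpu_a, cpu_b, ns / ITERS);
    return 0;
}
```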

>4c Intel chips were focused more on uniformity of memory access times, but that kind of fell apart once they went past three columns of cores and could no longer run enough rings to directly connect every distinct pair.

Why is Sup Forums so fucking full of idiots? Do you think microprocessors are made just by throwing transistors at the wall until something comes out, and more performance is just a matter of making it "bigger"? What the fuck kind of a question even is that?

vocaroo.com/i/s1qW4yZaQb2l

How can Intel use so much die space and still be worse than AMD's APUs on 28 nm? They must be really bad at GPU development.

In later iterations, they can just stack the System Agent, memory I/O, and L3 cache on top of the cores via EMIB to save mm² of die area.

seekingalpha.com/article/4051541-suerte-de-capote-intels-cannonlake-leaked-patent

No you're not wrong. AMDrones don't want to admit it for some reason but it introduces massive amounts of latency.

I honestly don't care.

Because they use their i7 production line to make i5 and i3 processors, cutting the iGPU would just mean spinning up new production lines, which would result in higher costs for nothing.

vocaroo.com/i/s0pKwQ1LfWDH

Well there is a good thing about iGPUs. Whatever happens, your PC will be able to display something if shit goes wrong with your GPU.

Why would this be better than the iGPU? Where would you see the benefits of having such a massive L3 cache?
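
Here's where you would see it. A minimal sketch: chase a randomly shuffled pointer chain of growing size and watch the time per hop jump every time the working set spills out of a cache level; a big L3 (or an L4) just pushes the DRAM cliff further to the right:

```c
/* Pointer-chase latency vs. working-set size, to expose cache levels.
 * Build: gcc -O2 chase.c */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define HOPS 10000000L

int main(void) {
    for (size_t kb = 16; kb <= 65536; kb *= 4) {     /* 16 KB .. 64 MB */
        size_t n = kb * 1024 / sizeof(size_t);
        size_t *next = malloc(n * sizeof(size_t));
        for (size_t i = 0; i < n; i++) next[i] = i;
        /* Sattolo's algorithm: one full cycle, so the chase never
         * settles into a small cache-friendly loop. */
        for (size_t i = n - 1; i > 0; i--) {
            size_t j = rand() % i;
            size_t tmp = next[i]; next[i] = next[j]; next[j] = tmp;
        }
        struct timespec t0, t1;
        size_t p = 0;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (long h = 0; h < HOPS; h++) p = next[p];
        clock_gettime(CLOCK_MONOTONIC, &t1);
        double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
        /* print p as a sink so the chase loop can't be optimized out */
        printf("%6zu KB: %5.2f ns/hop (sink=%zu)\n", kb, ns / HOPS, p);
        free(next);
    }
    return 0;
}
```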

>wanting to get rid of the iGPU
I bet you don't even run a fucking KVM with GPU passthrough, where the iGPU drives the host while the discrete card goes to the guest.

If you want that space occupied by cores, you should buy Ryzen.
If you want the iGPU cut off, you should buy a 7740X.

Intel pls

Orly?

Man, calm down. They posted the thread to ask, to become informed.

We can't all be geniuses like you man

ITT: nu Sup Forums doesn't know how a CPU works and why "just add more space!" doesn't work

DELET

>intel shill claiming to be an oldfag
Wow, I guess they hire you shills long-term then.

Then they could cut the GPU out, get more dies per wafer and more money per wafer, and, if Intel weren't as shit a company as they are, sell the CPU for less because they'd have more chips to sell.
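
That part checks out arithmetically. Using the classic dies-per-wafer approximation on a 300 mm wafer, with an assumed ~126 mm² quad-core-plus-iGPU die versus a hypothetical ~80 mm² iGPU-less version (both areas are illustrative guesses, not official figures):

```c
/* Dies per wafer, classic approximation:
 *   N = pi*(d/2)^2/A - pi*d/sqrt(2*A)
 * (gross dies minus the edge-loss term). */
#include <math.h>
#include <stdio.h>

static double dies_per_wafer(double d_mm, double area_mm2) {
    return M_PI * (d_mm / 2) * (d_mm / 2) / area_mm2
         - M_PI * d_mm / sqrt(2 * area_mm2);
}

int main(void) {
    const double d = 300.0;                       /* wafer diameter, mm */
    printf("with iGPU  (~126 mm^2): %.0f dies/wafer\n",
           dies_per_wafer(d, 126.0));
    printf("without    (~80 mm^2):  %.0f dies/wafer\n",
           dies_per_wafer(d, 80.0));
    return 0;
}
```

Roughly 500 vs. 800 candidate dies per wafer, before even counting the yield benefit of the smaller die.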

It's just what they call their interconnect. Since there's no real limit on how much it can connect, more than they'd ever actually put on it, they called it Infinity Fabric.

I like how despite the bashing of newfags there isn't a single actual answer here.

Wasn't it due to yields and latency? Larger dies mean more defects per die.

That is just placeholder space for CPUs with more cores, like the 6-core i7s or the 8/12/16/22-core Xeons.

And since those integrated GPUs are dirt cheap to make because of the laptop market, they probably cost a couple of dollars apiece.

It increases latency between clusters in exchange for decreased latency within clusters, plus scalability. Well worth the tradeoff, which isn't nearly as MASSIVE as you Intel shills claim.

It actually is useful. There's also no reason to make desktop CPUs physically small. Ryzen not having an iGPU was the only thing that kept me away from it this year. I'm building a compute machine; I don't need a gamer GPU. Why do I have to spend $100 on an RX 460 or 550 when I could have something in the CPU?

Then get a FirePro, dipshit.

Find me one under $100.

Wow, it's almost like you can run more programs and do more shit at the same time while still running heavily multithreaded programs too. Amazing concept, I know. I guess people who use their PC as a glorified game console wouldn't understand this, though.

ITT: people who don't understand how CPUs work and think you can just slap more cores on the board.

When in doubt, make more ring buses.

The iGPU allows better heat transfer to the heatsink; otherwise the die would have been too small, since apparently 4 cores is enough for everybody.

Big pools of cache run hot as fuck, especially Intel's. I guess you haven't heard about the problems they had with the eDRAM cache for their Iris Pro graphics?

They're already resorting to using toothpaste TIM. Are you saying not even that could save them if they removed the iGPU?