Will Intel ever make discrete graphics cards?

Other urls found in this thread:

en.wikipedia.org/wiki/Intel740
techreport.com/news/20222/intel-and-nvidia-hug-it-out-announce-new-cross-licensing-deal
gfxbench.com/compare.jsp?benchmark=gfx40&did1=32063337&os1=Windows&api1=gl&hwtype1=iGPU&hwname1=Intel(R) Iris(TM) Pro Graphics 580&did2=22333225&os2=Windows&api2=gl&hwtype2=dGPU&hwname2=NVIDIA GeForce GTX 970

Rajeet is soon to sell Radeon Technologies to Intel
It's the reason why the new AMD cards are now blue instead of red.

No. CPUs and GPUs are very different technologies. There's a reason why Nvidia and Intel are better than AMD in their respective markets, and that's because they each focus on one thing.

en.wikipedia.org/wiki/Intel740

Intel could catch up to AMD in a year if they wanted to. Their integrated graphics solutions are already far better for most users.

they've tried a few times

I hope not. Intel drivers are garbage.

Intel keeps trying to make X into a CPU or turn CPU into X.

That's why they keep failing in markets that aren't computer CPUs.

better at 3 times the price

I still have a few of those running in servers.

Why would they? Doesn't Intel own Nvidia?

Wonder if they can scale up their Iris Pro GPUs and slap them on a card with GDDR5/X?

Disregard me, I'm clearly off my rocker.

Their integrated graphics solutions are licensed from Nvidia, dummy.

Source?

I guess that's why they made a deal to license AMD GPU tech for their iGPUs.

not anymore

techreport.com/news/20222/intel-and-nvidia-hug-it-out-announce-new-cross-licensing-deal

>techreport.com/news/20222/intel-and-nvidia-hug-it-out-announce-new-cross-licensing-deal
>2011

None of their Skylake/Haswell iGPUs use NVIDIA tech.

why do people call them discrete? i thought dgpu stood for "dedicated graphics processing unit"

Yeah, they do. Not directly, but they do.

The deal ends this year. Intel jumped to AMD because Nvidia sucks at contracts.

>discrete
>Constituting a separate entity or part

dictionary - magical thing

That was my first GPU, ages ago no less.
It was the first time I experienced the "if I bought it, it must be good" mindset. It was a piece of shit.

Integrated GPUs are still dedicated, genius.

>ray tracing, etc.
Had a good laugh.

Hell no

I fucking wish. Duopolies are terrible, and the only thing worse is a monopoly masquerading as a duopoly because one of the two companies is comically inept. Having three competitors would be good for everyone.

Not going to happen though.

Even assuming Intel could produce the hardware, they are in no way prepared for the sort of software support gaming needs. If Intel focused on professional stuff where the amount of fuckery is limited, sure, but for vidya? Intel would get CRUSHED on the software side. Both AMD and Nvidia have large teams of programmers unfucking vidya in their drivers because so many games ship broken.

Hell, even today Intel's driver support is basically nonexistent: their hardware complies with a given spec and that's it.

>Will Intel ever make discrete graphics cards?
They tried.

It fucked up so badly it became a co-processor.

Why don't we use GPU sockets yet?

i thought he wanted them to match his shirt color

ATX will last forever, that's why.

Intel is rich as shit, they can afford to hire as many programmers as they like.

Clueless dumbass.

That isn't how these things work, user. Nobody can help Intel like that; it has to be done from scratch.

It's the same as if (for the sake of example) AMD suddenly kidnapped Nvidia's driver team: the hostages wouldn't be able to do much, as GCN and its driver tools are totally different.

Intel:
>Revenue US$55.4 billion (2015)
>Net income US$11.4 billion (2015)
>Total assets US$103.065 billion (2015)

AMD:
>Revenue $1.027 billion (Q2 2016)
>Net income $69 million (Q2 2016)
>Total assets US$3.316 billion (Q2 2016)

Yeah, as many as they like.

They could just buy AMD for a laugh and get rid of the CPU section to avoid a monopoly.

>None of their Skylake/Haswell iGPUs use NVIDIA tech.
That's like saying none of their processors use AMD tech. Just because the sticker isn't there doesn't mean they invented it. They INNOVATED UPON™ it.

You're positively retarded.

You do realise that at this level nothing is standardised, right? Intel, Nvidia and AMD all have custom tools and techniques to match their hardware.

Just because you are stupid and think a GPU is a GPU is a GPU, and that the problem can be solved by throwing money at it, doesn't make it so.

Intel could beat AMD's pajeet cards in a fight any day. Stop being such a fanboy.

Actually Intel focuses on:
>SSDs
>CPUs
>Integrated GPUs
By your reasoning they should be failing at everything, yet all their products are fantastic and they're swimming in cash.

That's all right, they're smart and competent people. They'd fix AMD's shit designs in a flash and quickly challenge Nvidia's spot as the top dog.

Call them up and tell them that; maybe they'll remember how to create a GPU that isn't half the die size (not counting the eDRAM) yet only 1/6 the performance.

See pic

>They'd fix AMD's shit designs in a flash and quickly challenge Nvidia's spot as the top dog.

Retards gonna retard, I suppose. Hint: it's almost as if different chip architectures are good at different things.

They can't be arsed, since there's no real money to be made in the GPU business. That's why they let suckers like AMD and Nvidia handle that shit.

>pajeet
Surely you realise that the father of the Pentium chips is an Indian (Vinod Dham).

I don't understand your point so far. Are you saying that no matter how much money Intel, who is rich as fuck and already develops GPUs, spends on their GPU department, they won't ever be able to compete with nVidia and AMD?

They aren't focusing their efforts on games, that's the thing. If they did, they'd kick AMD's face in.
He's a yank, not a pajeet.

Here's your top-of-the-line GT4 Skylake with a nice big fat slab of expensive on-package eDRAM: roughly 110mm2 of die for the GPU part and another good half of that for the eDRAM. Let's be nice and not count the eDRAM in the equation.

The 480 is 230mm2, around twice as big, and the 1060 is around 200mm2.

gfxbench.com/compare.jsp?benchmark=gfx40&did1=32063337&os1=Windows&api1=gl&hwtype1=iGPU&hwname1=Intel(R) Iris(TM) Pro Graphics 580&did2=22333225&os2=Windows&api2=gl&hwtype2=dGPU&hwname2=NVIDIA GeForce GTX 970

About 5 times slower than a 970, which is around a 480 in performance.

This means Intel's performance per mm2 is atrocious compared to both Nvidia and AMD.
Their GPU is just plain fucking AWFUL by comparison.
But since Intel is NIH incarnate, they chose to make their own shitty design instead of sticking with PowerVR, which has amazingly dense and powerful designs.
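
For anyone who wants to check the arithmetic, here's a minimal Python sketch of the performance-per-area comparison using the figures quoted above. The die areas and the ~5x gfxbench gap are the rough estimates from this post; treating the 1060 as roughly 970-class is an extra assumption added for the sketch, not something stated above.

# Back-of-the-envelope perf-per-area comparison from the numbers in this post.
# All figures are rough thread estimates, not measured data.
chips = {
    # name: (approx. die area in mm2, performance relative to a GTX 970)
    "Iris Pro 580 (GT4e, eDRAM excluded)": (110, 1 / 5),  # ~5x slower than a 970
    "RX 480": (230, 1.0),    # ~970-class, per the gfxbench comparison above
    "GTX 1060": (200, 1.0),  # assumed ~970-class for this sketch
}

for name, (area_mm2, rel_perf) in chips.items():
    print(f"{name}: {1000 * rel_perf / area_mm2:.2f} perf per 1000 mm2")

On those assumed numbers the Iris Pro lands at less than half the 480's performance per mm2, and that's before counting the eDRAM area at all.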

Intel can spend all the money it wants; it doesn't have a tenth of the graphics patents, legacy, or legion of driver hacks that AMD and Nvidia have.
Even if they fix their software, which would take years upon years, they need to fix their shitty architecture first.

The money investment isn't what's holding them back; it's the time it takes to build software tools, write driver hacks and generally deal with games' non-compliance with the DX/OpenGL specs. Can Intel do this? Sure. Can they do it easily? Not a chance.

GPUs are completely different beasts compared to CPUs, and Intel lacks the experience to build dedicated chips like Nvidia and AMD do.

Also, Intel's iGPU architecture is terrible for the most part.

How so? AMD has far more performance in their APUs.

1) Most Iris Pro 580s are massively underclocked because they are deployed in thermally restricted places like laptops and NUCs. Compare that to a 480, which has a bloody great cooler strapped to it.

2) Intel only optimizes their drivers for a very small number of games and applications.

When you actually compare apples to apples, Intel is doing very well on die space vs performance. They've caught up to AMD on a fraction of the expenditure. Intel's GPU division only has a handful of engineers.

I think both Intel and NVidia are able to make both dGPUs and x86 CPUs. They just don't want to. I'm sure that in 2011, when Intel paid $1.5b to NVidia, they had an agreement that Intel does not go into the dGPU market and NVidia does not go into the x86 market. They would just lose money because of the higher competition. The GPU and x86 markets would not magically grow by 50% just because a third company entered.

>NVIDIA
>x86

You have no idea what you're talking about faggot

Intel should have bought S3 Graphics and Matrox.

Only AMD and Intel possess the IP for x86 processor production; it's their mutual creation and easily the most important asset either company has. You won't see a third producer pop up unless one of them goes under.