Why doesn't intel make a dedicated GPU

it makes sense why Nvidia wouldn't make a cpu since intel has all the x86 licences. but why doesn't intel make a dedicated gpu? it would surely be better than the shit amd puts out at least, netting it somewhere below or at even level to Nvidia

Wow you're like a combo Intel Nvidia fanboy. Why don't you suck a dick right into hell faggot?

>having this much redness around one's rectum.

Apply a creme.

>AMD GPUs
>shit

Pick one.

Every GPU I've owned since the GeForce2 days has been an Nvidia GPU. I now own an RX 470 and have had zero issues with it.

Kill yourself, retard.

They could just put a bunch of Intel Iris cores on a PCB with 8GB of HBM2 and it would be amazing.

Intel is sitting on a dozen things they could be doing but aren't. They probably did a cost/benefit analysis and decided it wasn't worth it.

kill yourself

Their current integrated GPUs can't even beat AMD's old ones. Raven Ridge is going to completely shit all over that Iris Pro whatever-it's-called that nobody even uses.

Either you're too new to know about fucking Larrabee or you're retarded enough to have forgotten it. What a spectacular failure that was.

Why don't you do it?

They tried and failed already
>muh larrabee

You'd know this if you weren't an underage b& newfag

>Every GPU I've owned since the GeForce2 days has been an Nvidia GPU. I now own an RX 470 and have had zero issues with it.
This desu

I've had more issues with my GTX1080 than with my RX480, especially with my dual monitor setup. Nvidia's shit has never handled displayport daisy chains well and loves to randomly drop one of the monitors every few days. My RX480 hasn't had this issue so it's my work GPU while the 1080 is kept for gaymen

>faggots don't know that intel are failures at pretty much everything besides x86 cpus and enterprise ssds

My favorite is when idiots try and pretend AMD's drivers are still shit and NVidia can do no wrong, despite plenty of evidence to the contrary.

No one is actually this retarded

...

Because the amount of bullshit trickery that goes into GPU drivers to optimize them the way they are now would be insanely difficult for Intel to pull off entering the market fresh.
Their hardware could be 3x as powerful but would probably perform half as well, because they just don't have the same baked-in experience that Nvidia and ATI do

>despite plenty of evidence to the contrary.
There's never been concrete evidence on either side, you fanboy retard.

Larrabee didn't actually fail, they just never used it as a GPU. Xeon Phi dies, except the newest one, still have the dedicated graphics hardware because the decision to not sell it as a GPU came late in the design process. Theoretically if someone were to build a custom PCB and firmware set for them then it could be used as a GPU.

It's the 8 narrow bits between all the cores btw, 3 on top, 2 in the middle, 3 on the bottom; they're rasterizers.

google.com/search?q=nvidia drivers kill gpus

amd's crimson shit is a massive improvement

meanwhile we get shit like goyforce experience shoved down our throats, the first thing i did when installing the drivers for my 1080 was disable that cancer

Because AMD and Nvidia have mountains of patents, they don't allow new GPU vendors into the market.
They literally have filings that patent the most basic of GPU tasks.

They only ignore the patents for each other.

The minute you tried to make a GPU, you would have 1000s of lawyers ready to fuck you in the ass.

Yeah, I had a mini fireball once (GTX 550 Ti) and it was nothing but problems. Every card I've owned since then has been AMD and I've never had any issues.

That's because 3/4 of AMD GPUs are all rebrands.
Rx580/470 are stable because the drivers are over 1 year old.

It's the only way they can survive with 1/2 the money and manpower of Nvidia.

Well, they tried and failed. Miserably.

Intel is still learning how to make a proper GPU through their integrated shit.

Intel management is obsessed with x86 architecture.

They forced their engineers to make an x86 GPU, the engineers couldn't get it to work because of the x86 baggage, so they salvaged some of the R&D work and ended up with Xeon Phi, which is getting raped by Nvidia Teslas in pretty much every performance metric.

The supposed "ease of porting" advantage evaporated because Intel took way too fucking long to release Xeon Phi, and nearly all of the potential scientific computing and AI customers just rewrote their code for Nvidia CUDA in the interim.

Their iGPUs are enough for the non-gaymen goyim. A gaymen GPU wouldn't be profitable enough to bother with.

Intel GPUs are actually really shit
AMD has always btfo them in terms of integrated graphics

>ITT: people think a GPU's only use is games
Nvidia needs competition in the GPU-accelerated computing field.

>1/2 the money
much lower than that, as amd has to split its r&d between the cpu and gpu divisions

I heard Xeon Phi is what they made instead of a GPU.

In theory Intel could make GPUs and shit on everyone by enacting microsoft deals and cucking linux to death while beating nvidia and amd into smaller companies. The thing is, the money required to start such a venture is horrendous, and even worse, it would have to be held up against current budget options which are a lot faster than most 4-year-old cards. They'd be tapping a market that's currently on fire; the best deal they'd get is embedded dGPUs in laptops or prebuilts, but they already have that covered with iGPUs, and no one expects much from integrated graphics, so they can get away with it.

op just pull the cock out of your ass and go outside.

They're also like a fifth of the size of either of their competitors so yeah.

is this a social media campaign to run around like butthurt amd fanboys to make everyone hate them and AMD? they're in every thread. the butthurt is so obvious that i can't help but assume it's fake.

#TeamRed

It would be crushed by any high end Nvidia GPU. Hell I bet even AMD could crush any Intel GPU that even resembled Larrabee.

Because they already have like 95% GPU marketshare without needing to?

interestingly enough, that's the same argument that can be used against Intel's tick-tock plan: since everything is a marginal increase in power with each generation based on the same technology, the platform benefits from being stable.

and that's why i avoid AMD cpus, since they keep making big leaps at new technology and it's hard to predict if it's going to pan out.

I wouldn't mind an AMD gpu but i hear they're power hungry and inefficient... and i'm worried that after 3 years i'd have spent enough extra on the power bill that i could have bought a gtx1080 and 3 years of power instead. kinda like paying forward so you don't pay later.

[Citation Needed]

Some YouTuber did the math on a 290x I believe and it would cost about $25 more a year than a GTX 980 at the time.
It does save you money, but it's a super small amount.
(Depends on where you live as well)
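The back-of-the-envelope math is easy to sanity check yourself. A minimal sketch in Python; the function name and all the numbers (~100W load delta, 4 hours of gaming a day, $0.15/kWh) are just illustrative placeholders, not the YouTuber's actual figures:

    # rough extra electricity cost per year from the higher-draw card
    def annual_cost_delta(extra_watts, hours_per_day, price_per_kwh):
        extra_kwh = extra_watts / 1000 * hours_per_day * 365  # extra energy per year in kWh
        return extra_kwh * price_per_kwh

    # assumptions: ~100 W extra draw under load, 4 h of gaming a day, $0.15/kWh
    print(round(annual_cost_delta(100, 4, 0.15), 2))  # ~21.9

That lands in the same ballpark as the quoted ~$25/year, so it really only matters if your electricity is expensive or the card is loaded around the clock.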

don't forget amd's longevity compared to nvidia. have you guys seen the graph of the 280X beating the 780? it means you don't actually have to upgrade as often going the amd route (saving more than you'd spend on power)

>since intel has all the x86 licences
No, they don't. Fucking idiot.

They did, it's called "Xeon Phi" and they disabled the actual graphical parts for some bullshit reason. There's an article about it somewhere, look it up. IIRC the first couple versions actually had traces on the PCB for DVI outputs, though they weren't connected obviously.

>it would surely be better than the shit amd puts out at least, netting it somewhere below or at even level to Nvidia
How much crack do you have to smoke to actually believe this?

You actually think Intel can afford to blink, sacrificing a humongous amount of resources to enter an industry so hot and tight that it can squeeze you drier than a rock?!? Intel cannot afford to lose ground in CPUville. AMD can really hurt them at the most minuscule of opportunities (look how close AMD came with Ryzen).
Let Intel focus on what they do best, and that is business.

>Sup Forums is now so young they don't remember this

I should buy one just to feel its aura of shit. Plus to stare and laugh at it.

Intel eventually integrated that GPU into its i810 and i815 motherboard chipsets (1999-2000 era). Those were the first with Intel integrated graphics.

Considering it's a card that came out in 1998, and it offered better performance than the original Voodoo (1997) with better image quality, it wasn't a complete disaster. It's just that Intel promised it would be the second coming and it just ended up a mid-range card at best.

>shit noone bought

k

Maybe barely anybody bought the discrete card, but lots of people had the GPU when it was built into motherboard chipsets.

It was common enough that even Warcraft 3 officially supported the card.

Their cousin schlomo at nvidia has it under control

Because they're too retarded.

Intel doesn't have time to waste money researching the technology, and they're not that interested in buying it either. There is little to gain from becoming the third player in the market. Plus, production costs might not be favourable for them, since they already put a lot of money into their CPU department, so it would be a weird sacrifice to switch things around.

tl;dr - 2much in costs, 2little in gain

Intel's iGPUs are complete fucking trash. Why do you think they've just done a deal to licence AMD's tech for their future ones?

What good is it?

Zen HEDT CPUs are called Threadripper (good job, marketing..)
Each CPU will include 64 PCIe lanes!
It includes 4 CCXs.
Lower SKU (probably 12/24) 140W TDP, higher SKU (probably 16/32) 180W TDP.
Socket will be an LGA SP3
Platform's name will probably be X399
Chips will be B2 revisions.
32MB L3 cache
ES's are 3.3 or 3.4 GHz base and 3.7 GHz boost
Retail SKUs are aimed at 3.6 GHz base / 4 GHz boost
ES's in the wild score 2500 in CB R15.
Infinity Fabric can have a bandwidth of up to 100GB/s

>r7 siamese twin
i want to believe

was planning an upgrade to x99 or ryzen but looks like it might be safer to wait since both amd and intel are launching new shit later this year

a 16 core zen thing sounds perfect if they nail the pricing like they did with the 6 & 8 core parts

>why doesn't this multi-billion dollar Fortune 500 company do my idea, why aren't they doing it? The idea from me, an anonymous poster on the internet

There's a good fucking reason.

Kek, geforce experience installs system-level whitelisted node.js

literally every nvbabby is vulnerable to thousands of viruses.

>They could just put a bunch of Intel Iris cores on a PCB with 8GB of HBM2 and it would be amazing

Average intelligence of posters on Sup Forums's ''''''''''technology'''''''''' board.

cause intel cannot make a gpu, fag, they'd have to pay for patents to even make one since they've got no skill

Also because anything they come up with is probably covered by an ATI/AMD or Nvidia patent already

>shitty excuse

>intel damage control

They tried, and when they realized that CPUs make shit GPUs they stopped trying.

fun fact: if you zoom in on a die you get an egyptian papyrus painting

They tried.
It went so fucking terribly it turned out to be the Xeon Phi

They tried and failed. Turns out making a good gpu is not easy. The only reason AMD can do it now is because they bought ATI.

>goyforce experience shoved down our throats
>an easy to spot check box on installation asking whether or not you want to install it

Are you a retard who thought he got tricked?

Flip it the other way - Nvidia thought making a cpu would be easy and has basically failed at it.

This is the reason/thread

no, but it'd beat being such a weak shill that i had to resort to attacking people instead of product

>why doesn't intel make a dedicated GPU
Because they tried with Xeon Phi and it flopped as a GPU.

The Xeon Phi was never a GPU; they're co-processors and were never intended to be used as anything like a GPU

Because Intel's current GPU architecture doesn't scale all that well, and Intel is pretty shit at making new architectures and would rather milk an old one with better fabs

This is the correct answer. It turns out there are only so many ways to draw a triangle. These methods are all patented.

>never intended to be used like a GPU
en.wikipedia.org/wiki/Xeon_Phi#Background

Intel has an onboard gpu.

Insert a graphics card and it turns off.

...yeah they're fkin wit you.

It's checked by default. Can't expect everyone to read all that shit when they think all they're doing is installing a new driver.

Dunno, it would be a low-wattage card that would work on all platforms without fuss. It could even be passive, and hell, the top iGPUs are not even THAT horrible in terms of performance.

It would be awesome.

>tfw every AMD and NVIDA GPU I've owned has been fine
>tfw people actually play the company shill game

This, they cancelled it because it got pounded by the 8800GTX, which was already years old when Larrabee was getting close to release.

>top iGPUs are not even THAT horrible in terms of performance
They're also by AMD. Intel's iGPUs are mediocre at best. Intel is even using AMD iGPUs soon.

AMD drivers fucking suck on ALL OS.
This was always the case and no matter how many times they rename it or add such suffixes as ULTIMATE or CRIMSON or GAYMER2000, it won't change.

On Windows it's mediocre. Unless your card is 2 years old and it's "legacy" already. On Linux it's between "fucking horrible" and "fuck you". On BSD? Good fucking luck.

Intel meanwhile "just werks".
> inb4 x y z fanboy
I have an Nvidia GTX in my laptop but use the Intel iGPU too, another GTX in my desktop, and a Radeon in my other desktop

The only Intel iGPUs that have ever been competitive have been included on expensive Core i7 products with 128MB of eDRAM

Works great for passthrough in a VM
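If anyone wants to try it, step one is checking your IOMMU groups so you know the GPU can be split off cleanly. A minimal sketch of that check in Python (assumes Linux with the IOMMU enabled in firmware and on the kernel command line, e.g. intel_iommu=on; it just reads sysfs):

    # list IOMMU groups and the PCI devices in each; the GPU you pass through
    # should sit in its own group (possibly together with its audio function)
    import os

    groups_dir = "/sys/kernel/iommu_groups"
    if not os.path.isdir(groups_dir):
        raise SystemExit("no IOMMU groups found - enable VT-d/AMD-Vi and the iommu kernel option")

    for group in sorted(os.listdir(groups_dir), key=int):
        devices = sorted(os.listdir(os.path.join(groups_dir, group, "devices")))
        print("group {}: {}".format(group, " ".join(devices)))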

>Intel meanwhile "just werks".
Eh, Intel isn't that good on Linux. I thought I could get by with the Intel iGPU on Linux and pass through a GPU to Windows for gaymes. I've had nothing but issues with Intel's Skylake graphics though, both with the xf86 and modesetting drivers. Horrible screen tearing on both GNOME and KDE and random graphical glitches all the time. Nvidia's drivers work on my machine so I'll probably get a cheap card instead of using the iGPU.
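If the tearing is on Xorg with the xf86-video-intel driver, the stock suggestion people throw around is its TearFree option. No promises it fixes this, but for reference it goes in something like /etc/X11/xorg.conf.d/20-intel.conf (the filename and the Identifier string are just the conventional choices):

    Section "Device"
        Identifier "Intel Graphics"
        Driver     "intel"
        # trades a little latency and memory for tear-free output
        Option     "TearFree" "true"
    EndSection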

>AMD drivers fucking suck on ALL OS.
They've gotten a lot better on Windows, don't know the situation for muh freedom OS.

That was a 3D accelerator. Not a GPU

>intel
>just werks
Clearly not with KDE, source: my very "mature drivers" 3570k

Also Nvidia's blob isn't much better, half the time it shits itself after suspend.

>Intel meanwhile "just werks".

Only because Intel don't bother with games. Intel is in no way prepared to provide the required software support for a vidya gpu.

intel iGPUs just werk

and they last forever, unlike the 2-3 year lifespan of an amd/nvidia gpu

kaby lake igpus support 4k 60fps HEVC 10-bit, which is all you really need if you're a non-gaymer

>why doesn't intel make a dedicated gpu?

Probably because they don't believe the revenues would make up for the development cost.

The integrated GPU was a huge win for Intel -- it basically snatched the entire lower-half of the discrete GPU market away from its competitors. (For example, notice that in the GeForce 900 generation, there is nothing below the 950 at about $160.)

Intel was able to pick the low-hanging fruit. But the fruit that remains now is higher up in the tree. To enter the discrete GPU market, Intel would have to compete head-to-head with the established players, requiring a lot of capital investment. The bean counters need to be convinced that they can grab enough cheddar away from NVidia and AMD to win back that capital. Without a convincing case, they won't approve the expansion.

It's true that Intel could theoretically use its existing integrated GPU technology as a starting point. But it would still require a major re-architecture for Intel to specifically meet and beat every performance metric of the GTX 1080, otherwise there would be no point in the project.

And because of the Ryzen fiasco, Intel is playing a defensive game now, which makes it even more unlikely that they would pursue a major new market now.

Their Knights Landing cards were destined to be GPUs, but then the bean counters pulled out of that game and now we have some Intel compute cards instead. Some of the earlier models even had all the hardware (sans the output ports) to push a framebuffer to a display.

Too hard to do the drivers and optimization (or really, bad code replacements) for all the games.

Intel GPUs don't last forever. The drivers are too buggy and get left without support too soon. Hint: Sandy Bridge.

they are, but it's going to be intended for servers. Not sure yet if it will be for consumers

>and they last forever
Unless you need drivers that work for more than a year or two.

It's amazing that AMD can compete at all.

Jewforce experience selling your information for shekels...

It's called Iris pro and it's all you need