Intel leaks their first consumer GPU

holy shit AMD and Nvidia on suicide watch


go to bed pajeet

now this is shitposting

is this real life?

The thing is, they could actually do this if they wanted to and absolutely DESTROY Nvidia and AMD, but they can't because of antitrust laws.

So if I have an AMD GPU, an Nvidia GPU, and an Intel GPU, can I say I have RGB GPUs?

...

About 8 years ago, yes!
It didn't perform well at all, so Intel scrapped it as a consumer part and retooled it into Xeon Phi.

computerworld.com/article/3086072/computer-hardware/intel-pits-monster-72-core-xeon-phi-chip-against-gpus.html

Intel has no GPU patents. They would have to make a brand new graphics architecture from scratch.

What makes you think they can compete with Nvidia? Nvidia isn't like AMD. Nvidia fights dirty just like Intel, and they have dedicated internet shills just like Intel does.

>every1 is bad except amd xddxdx
jesus christ kid

I literally just said that Nvidia was GOOD, you retard. Nvidia and Intel having internet shills is a good thing. Learn to read.

Intel could say this is a general-purpose accelerator and not a GPU, but since it's general purpose it can also run GPU tasks.

If they were smart, they could push out just Vulkan drivers and take cover behind the argument that Vulkan is so low-level that it's a natural way to use such a general-purpose accelerator.

If it ain't ray-tracing, it ain't shit.

Hey buddy. Just so you can sleep peacefully tonight,

this was me

this was not

As to your question, I have no idea; I was just shitposting.

Have a good day.

It is to be expected

>in machine learning
Compute applications is pretty much all it's for now, yes.

>They would have to make a brand new graphics architecture from scratch.
Why would they do that when they already have one that's working perfectly well?

Why is it blue?

>Xeon Phi
lel it's shit

>only shoots 4 bullets before needing to be replaced because that's all anyone will ever need

Would it be feasible for them to release it with just basic drivers and let Microsoft and Mesa developers come up with graphics drivers, using it like a compatibility layer?

So you and 90% of Sup Forums do it for free. Interesting.

They can just polish Intel HD Graphics/Iris

>Intel
>Open Source
pick one. You have a better chance of AMD doing that. AMD has already made some of their technologies, like Freesync, open source.

I tried it out, it was so fast

>FNbabby

Freesync is royalty-free, not open-source.

I did not mean for Intel to open source anything; they would not even develop the thing themselves. And they are active in the open source community; just look at the current Mesa Intel drivers. What came to mind was how Sandy Bridge got geometry shaders. They could try to pull something similar now.

Is this just fantasy?

Just out of curiosity, if Intel made discrete graphics cards, how different would it be from their iGPUs? Would the current free Linux drivers work with them?

I think there would be a market for "Intel HD Graphics" cards that slot into PCI-E, now that AMD is viable for low-GPU-power workstations but doesn't have graphics by itself.

I'd buy something like Iris Pro on a PCIe x1 card. It would be perfect for desktop use with a couple gigs of RAM, and it keeps my x16 slot free for passthrough to a VM.
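Quick napkin math on whether an x1 slot could even feed a card like that (assuming a PCIe 3.0 link; the slot widths are just the ones mentioned above):

```python
# PCIe 3.0 runs at 8 GT/s per lane with 128b/130b line coding,
# so usable bandwidth per lane is 8e9 * (128/130) / 8 bytes per second.
GT_PER_SEC = 8e9          # transfers per second per lane
ENCODING_EFF = 128 / 130  # 128b/130b coding overhead

lane_gbps = GT_PER_SEC * ENCODING_EFF / 8 / 1e9  # GB/s per lane

x1 = lane_gbps        # single-lane slot
x16 = lane_gbps * 16  # full-width slot

print(f"x1:  {x1:.2f} GB/s")   # ~0.98 GB/s
print(f"x16: {x16:.2f} GB/s")  # ~15.75 GB/s
```

About 1 GB/s on x1 versus roughly 16 GB/s on x16; plenty for pushing a desktop, which is the whole point.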

superb.

Their iGPUs were initially discrete cards; everything flows back to the i740.
Larrabee and its descendants (pictured in OP) aren't anything like the iGPUs. They're x86-64 cores plus some extra instructions.

So it's not a technical reason, like having to make a new architecture, that's stopping Intel from making discrete graphics, but rather the market?

Wow, no DisplayPort? It's magic

...

when intel makes a gpu that is actually decent, that'll be the fucking day, i'd get an enema if that happened

redditor please leave

>Intel makes a dedicated GPU
>Nvidia makes an x86-64 CPU

A golden age.

>nVidia gets an x86 license
Not over Intel's cold dead corporate entity.
Why do you think they paid AMD the ransom price for their graphics IP this time around? They knew Nvidia's asking price was an x86 license, and that was that.

Intel will never allow that to happen, they'll fight tooth and nail to keep nvidia out of x86.

I use AMD CPU + Nvidia GPU because the Pascal architecture just performs.

>blower fan
Good job Intel.
You fucked up the i9 series and you fucked this up too.

GPU design is simple in comparison to the GPU's specs.

Jesus christ.

Nvidya will always rule.

The Xeon Phi is just 72 Atom cores strapped together with a bunch of high-bandwidth memory. The Atom CPUs that ship in laptops and such have HD Graphics, but the Xeon Phi ones don't, which is a weird choice because Intel's Beignet OpenCL implementation has been quite successful on iGPUs. I don't know if they had to leave out the iGPU in favor of AVX-512 or what, but it would be nice to have it for running OpenCL.
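Those 72 cores plus AVX-512 are the whole pitch. Back-of-envelope peak double-precision throughput (assuming the top Knights Landing SKU at roughly 1.5 GHz with two AVX-512 FMA units per core; treat the figures as ballpark, not a spec sheet):

```python
# Peak double-precision FLOPS for a 72-core Knights Landing Xeon Phi:
# each core has 2 AVX-512 vector units, each retiring 8 double-precision
# FMAs per cycle, and an FMA counts as 2 FLOPs.
CORES = 72
GHZ = 1.5             # assumed clock for the top SKU
VPUS_PER_CORE = 2     # AVX-512 FMA units per core
DP_LANES = 8          # 512 bits / 64-bit doubles
FLOPS_PER_FMA = 2     # multiply + add

flops_per_cycle = VPUS_PER_CORE * DP_LANES * FLOPS_PER_FMA  # 32 per core
peak_tflops = CORES * flops_per_cycle * GHZ * 1e9 / 1e12

print(f"~{peak_tflops:.2f} TFLOPS DP")  # ~3.46 TFLOPS
```

Around 3.5 TFLOPS of double precision with no GPU in sight, which is exactly why it gets pitted against GPUs in compute.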

>gotta pay more for a threaded barrel

Caught in a landslide

Wait, that's not a GPU.

It's challenging GPUs in compute, but it isn't a GPU. You can't game on it.

> Intel leaks their first consumer GPU
Codenamed "Knights Landing".

Get an AR-15

Scaramouche, Scaramouche

Because red and green were taken.

...

>but it would be nice to have them to run OCL.
opencl is one of the ways you make use of a xeon phi

Care to explain?

underage b&

How exactly is it shit? Do you even know what someone would use a PPU for?

will I finally be able to run Crysis on high?

You can just cut the extra lanes off an x16 card, or cut the slot open so you can fit x16 cards in x1 slots.

>Intel has no GPU patents
They actually do. They bought Real3D in the 90s to get a start on GPUs; Real3D was one of the leading graphics companies before the mid-90s, part of Lockheed, and worked on things like the Apollo landing simulator.
Of course, just like anything that isn't P6 or a P6 derivative, they fucked up, released a subpar product, and quickly abandoned it, then retooled it into integrated graphics and reused that subpar 1995 architecture until Sandy Bridge and the second generation of HD Graphics launched.
They tried again with Larrabee in 2007, but we all know how that ended: as a coprocessor.
Intel actually has a few dozen full-time developers on their FOSS GPU drivers; they work on the latest chips and the community backports the stuff to older ones.
Also, since they have such massive dominance in the iGPU market, even third parties like Google end up backporting and developing new features for older chips. In 2013 the GMA 950 got OpenGL 2.1, on a chip released in 2005 whose Windows driver only supports OpenGL 1.5.

Will you do the fandango?

It's going to be shit like everything Intel makes these days.
Btw, Intel is pulling out of single-board computers.
hackaday.com/2017/06/19/intel-discontinues-joule-galileo-and-edison-product-lines/

They just suck at everything.

>Intel leaks their first consumer GPU
W R O N G

Also, OP is a faggit.

Pic related: Here is Intel's first GPU. AGP.

Nvidia has neither an x86 license nor an x86-64 license.

The proprietary software meme needs to die. Then instruction set architecture would become nearly meaningless and x86 would vanish into obscurity.

...

...

Thunderbolt and lightning,
Very, very frightening me.

Proprietary software encourages improvement and innovation.

>Why do you think they paid AMD the ransom price for their graphics IP this time around

They didn't; stop spreading this bullshit.

probably fake

>i'd get an enema if that happened
Why wait? They're great.

Galileo

Holy shit user, you're right. 3dfx, ATi, and nVidia fags are gonna get rekt! I hope it goes well with my Pentium II rig, though. Will it be PCI only, or will there be an AGP version?

I'm just a poor boy, nobody loves me.

Fucking Casio didn't have an x86 license, but did anyone stop them?

Intel's face when RISC-V replaces 8086 trashware in ten years

HE'S JUST A POOR BOY FROM A POOR FAMILY [...]

spare him his life from this monstrosity

If you look at Intel's acquisitions:
crunchbase.com/organization/intel/acquisitions

they seem to have fallen for the machine learning and computer vision meme. I expect Intel to delve a lot deeper into graphics processors, but always in the form of iGPUs.