Why doesn't Intel enter the GPU market, and why doesn't Nvidia enter the CPU market?
Pic not related
Intel tried with Larrabee. It failed miserably.
en.wikipedia.org
Nvidia makes Tegra processors. They're not too bad as far as mobile chips go.
Off-topic note: AMD is taking them both on at the same time; an interesting position to be in.
Funny thing is, despite not having an explicit x86 licence, Nvidia did pick up Transmeta's tech (they licensed the IP), the same Transmeta that made those weird code-morphing processors you could find in a few low-end netbook-like computers in the mid-00s.
Nvidia, if they felt like it, could dump a shitload of money into CPU R&D and push out a code-morphing processor explicitly designed for x86 emulation. Performance-wise it'd probably stink, but it could feasibly be done.
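The basic trick is just translate-then-execute. Rough toy sketch of the idea in C (the micro-op set and the handful of hardcoded opcodes are made up for illustration; Transmeta's actual CMS translated to native VLIW code and cached hot traces, which this doesn't do):

/* Toy code-morphing sketch: decode a tiny subset of x86 machine code
 * into generic micro-ops, then execute those. No bounds checking,
 * no translation cache; strictly an illustration. */
#include <stdio.h>
#include <stdint.h>

enum uop_kind { UOP_LOAD_IMM, UOP_ADD, UOP_HALT };

struct uop { enum uop_kind kind; int dst, src; uint32_t imm; };

/* "Morph" phase: x86 bytes in, micro-op trace out. */
static int translate(const uint8_t *code, size_t len, struct uop *out)
{
    size_t pc = 0;
    int n = 0;
    while (pc < len) {
        uint8_t op = code[pc];
        if (op >= 0xB8 && op <= 0xBF) {            /* mov r32, imm32 */
            uint32_t imm = (uint32_t)code[pc + 1]
                         | (uint32_t)code[pc + 2] << 8
                         | (uint32_t)code[pc + 3] << 16
                         | (uint32_t)code[pc + 4] << 24;
            out[n++] = (struct uop){ UOP_LOAD_IMM, op - 0xB8, 0, imm };
            pc += 5;
        } else if (op == 0x01) {                   /* add r/m32, r32 (reg-reg only) */
            uint8_t modrm = code[pc + 1];
            out[n++] = (struct uop){ UOP_ADD, modrm & 7, (modrm >> 3) & 7, 0 };
            pc += 2;
        } else if (op == 0xC3) {                   /* ret: treat as halt */
            out[n++] = (struct uop){ UOP_HALT, 0, 0, 0 };
            break;
        } else {
            return -1;                             /* unsupported opcode */
        }
    }
    return n;
}

/* Execute phase: run the trace against emulated registers. */
static void execute(const struct uop *u, int n, uint32_t *regs)
{
    for (int i = 0; i < n; i++) {
        switch (u[i].kind) {
        case UOP_LOAD_IMM: regs[u[i].dst] = u[i].imm;        break;
        case UOP_ADD:      regs[u[i].dst] += regs[u[i].src]; break;
        case UOP_HALT:     return;
        }
    }
}

int main(void)
{
    /* mov eax, 2; mov ecx, 3; add eax, ecx; ret */
    const uint8_t x86[] = { 0xB8,2,0,0,0, 0xB9,3,0,0,0, 0x01,0xC8, 0xC3 };
    struct uop trace[16];
    uint32_t regs[8] = { 0 };
    int n = translate(x86, sizeof x86, trace);
    if (n > 0) {
        execute(trace, n, regs);
        printf("eax = %u\n", regs[0]);             /* prints: eax = 5 */
    }
    return 0;
}

Transmeta's real hardware kept translated traces in a cache so hot code only paid the translation cost once; an interpreter-style loop like this is roughly what cold code went through.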
Same guy here. There's stuff like Intel HD graphics and Iris Pro, but they're not quite on the same level as Nvidia and AMD in the graphics department.
Maybe Intel secretly doesn't like gamer culture.
They'd have to ramp up their gamer marketing.
Which probably disgusts them, rightfully so.
>those bazoongas
WEW LAD
Also, Intel tried and failed. Now they don't even try to step outside the proverbial box.
Why? I'd imagine there's loads of money to be made appealing to gamers. If they're such Jews as Sup Forums wants me to believe, why wouldn't they pander to gamers as well?
What's harder / more costly / more time-consuming to develop: GPUs or CPUs?
GPU: hardware easy / software hard
CPU: hardware difficult / software easy
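To put the "GPU: software hard" half in concrete terms, here's the same SAXPY written both ways in plain C. The second version fakes the SIMT model: the loop in main() stands in for a kernel launch over thousands of hardware threads, and on a real GPU you'd also be juggling memory transfers, divergence, and synchronization, which is where the pain lives.

#include <stdio.h>

#define N 8

/* CPU style: one thread, one loop. The out-of-order core does the
 * heavy lifting, so the software stays dead simple. */
static void saxpy_cpu(float a, const float *x, float *y, int n)
{
    for (int i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}

/* GPU style: the loop disappears and each "thread" runs the body for
 * exactly one index, so the programmer owns the work decomposition. */
static void saxpy_kernel(int tid, float a, const float *x, float *y, int n)
{
    if (tid < n)                        /* guard against over-launch */
        y[tid] = a * x[tid] + y[tid];
}

int main(void)
{
    float x[N] = { 1, 2, 3, 4, 5, 6, 7, 8 };
    float y[N] = { 0 }, z[N] = { 0 };

    saxpy_cpu(2.0f, x, y, N);
    for (int tid = 0; tid < N; tid++)   /* stand-in for a kernel launch */
        saxpy_kernel(tid, 2.0f, x, z, N);

    for (int i = 0; i < N; i++)
        printf("%g %g\n", y[i], z[i]);  /* the two columns match */
    return 0;
}

The hardware split runs the other way: a GPU is mostly that one simple lane stamped out thousands of times, while a CPU burns its transistors on branch prediction, out-of-order machinery, and cache coherency to make the naive loop fast.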
Mostly because most game programmers are from the bottom 10% of the programmer heap.
Jewtel has been irrelevant to games since the first Xbox, the one that featured a Celeron.
Larrabee didn't fail; it just never made it to consumers. They sell 'em as Xeon Phi. Hell, Knights Corner still has the texture units and rasterizers on the die, along with a connection for video out. It's just dead-pinned and not exposed in software.
That was Denver, originally meant to do both ARM and x86. The x86 performance was crap and they couldn't get a license for the newer patents, so the x86 side got dropped.
GPUs have a ton of repeated structure but the control hardware is insanely complex.
>GPU: hardware easy
Lol, maybe with old vertex/pixel pipelines, not with unified shaders.
"The project to produce a GPU retail product directly from the Larrabee research project was terminated in May 2010.[2]TheIntel MICmultiprocessor architecture announced in 2010 inherited many design elements from the Larrabee project, but does not function as a graphics processing unit; the product is intended as a co-processor for high performance computing." - Wikipedia
It "lives," but not as a proper gpu though.
sauce on bazonga?
The Nintendo Switch actually uses a Tegra chip.
That's a negative, champ.
It's the same fuckin chip, mate. One of the guys who worked on it (and retired shortly after) did a whole overview of the project and what happened. They didn't just "inherit design elements"; they straight up used the same chip. Knights Ferry is Larrabee 1, intended as a prototype rather than a product. Knights Corner is Larrabee 2, which was going to be their first commercial product; it still has every bit of hardware needed to be a GPU on die. That's how late Larrabee was cancelled. Knights Landing has no GPU-specific hardware but still has the same layout Larrabee 4 was designed to have.
Hell, you can even straight up SEE the texture hardware on Knights Corner in leaked die shots.
NVIDIA and AMD hog the desktop GPU patents. They are a cartel. AMD and Intel hog the desktop CPU patents.
They're all fucking patent hogs and your meme wars are stupid; you should hate all of them.
If the market were actually open, prices would be 50% lower.
That pic is making my back hurt...
Patent hogging.
/thread
Hey baby, show us your teets
Kayla Truskin
>when you wanted a toy locomotive but Mom got you an actual working railroad