Intelcucks on suicide watch

tech-critter.com/2017/10/intels-gpu-licensing-agreement-with-amd.html

nOIOOOOOOOOOOOOOOOOOOOOOOo

It's real. HAHAHAHA

nVidia on suicide watch, film at 11.

ranker.sisoftware.net/show_system.php?q=cea598ab9fa89aa99fb9dee3ceffd9ab96a680e9d4e5c3ab96a385fdc0f1d7b2d7eadafc8fb28a&l=en

Holy kek Intel are literally beggars.

why did AMD do that? why on earth would you rent your IP to Intel after publicly denying it?

WHAT THE FUCK

what the fuck

Hopefully this is a shoop

It is.

HAHAHAHA
INTEL FANS REALLY ARE CUCKS

i really hope the stock doesn't dip much :^)

Isn't the Vega GPU one of Raven Ridge's most competitive features? It wouldn't make any sense for AMD to license it to Intel too when they need all the mobile market they can get.

>ITS FAKE GUYS. PLS ITS FAKE

How does this prove anything?

AMD needs to do whatever they can in order to salvage the train wreck that is Vega.

>Video Adapter: 694E:C0
HUR DUR

They are already doing it by simply writing the software.
It's a combination of price, Zen cores and iGPU.

But how does it prove it's AMD?

>what is 0x1022 pci vendor id
cucks, retarded intel cucks everywhere!
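For anyone puzzled by the vendor-ID greentext: PCI-SIG assigns every hardware vendor a 16-bit ID, and 0x1022 is AMD proper (AMD GPUs actually still report the legacy ATI ID 0x1002, which is the one a "Video Adapter" entry would carry). A minimal sketch of checking an adapter string like "694E:C0" against the well-known IDs; the helper names here are made up for illustration, not from any real tool:

```python
# Well-known 16-bit PCI vendor IDs from the PCI-SIG registry.
# 0x1022 is AMD proper (CPUs/chipsets); GPUs use the legacy ATI ID 0x1002.
PCI_VENDORS = {
    0x1002: "AMD/ATI (GPU)",
    0x1022: "AMD (CPU/chipset)",
    0x10DE: "NVIDIA",
    0x8086: "Intel",
}

def parse_adapter(entry: str) -> tuple[int, str]:
    """Split a 'DEVICE:REV' string like '694E:C0' into (device_id, revision)."""
    device, revision = entry.split(":")
    return int(device, 16), revision

def vendor_name(vendor_id: int) -> str:
    """Map a PCI vendor ID to a human-readable name."""
    return PCI_VENDORS.get(vendor_id, f"unknown (0x{vendor_id:04X})")
```

On real hardware (Linux) you can cross-check by reading `/sys/bus/pci/devices/*/vendor`, which is also why a benchmark-database string alone is easy to spoof while the ID reported by the actual silicon is not.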

is this impossible to spoof?

Tesla-exclusive APU? Apple-exclusive APU?

>why did amd do that?
The $600M that Intel paid Nvidia for the same thing last year.

600 million? WTF

Now developers need to optimize for Vega, no more GameWorks.

To me, this suggests that AMD really is trying to choke Nvidia out of the gaming market. They got their parts into the consoles and that's what people thought... well if AMD GPUs are in the consoles and are sold with every Intel processor, a game developer would have to be a retarded toddler not to optimize for AMD GPU. Literally every device capable of playing a AAA game would have an AMD GPU. Some would have an Nvidia card, but it would ALWAYS at least have an AMD iGPU.

I really want it to be true for the lulz

from a GPU-oriented point of view, it is plausible.
if we look at the other side of the coin, the CPU side, it looks like AMD is undermining itself by offering Intel a capable GPU for its CPUs, when AMD could keep it for its own CPUs and crush Intel in the mobile market.
but it seems you are right. AMD must think it cannot beat Intel swiftly in the mobile CPU market, so they turned their wrath on destroying Nvidia instead.
good analysis.

vega11 where?

next year. prim shader drivers have to come first.

Q1 2018.

No, it's simply a custom SKU for Apple.
And no, AMD is moving as far away from developer "optimizations" in Vega as possible.

>le magic driver

17.320 is with vanguard program participants, they are under NDA.
now you can shitpost about this also.

Ditching D3D truly is a magic unknown to the Taiwanese.

the picture in the OP doesn't look as if it's a custom Apple SKU. That definitely looks like general Intel marketing material.

>AMD is moving as far away from developer "optimizations" in Vega as possible
I don't know where you got that information, but Mantle, Vulkan and DX12 have AMD-specific tools that require developers to write custom code for maximum performance. Primitive shaders, for example, have to be implemented on a case-by-case basis to be a real benefit. AMD said they work even if developers don't make custom implementations, but in much the same way that open source GPU drivers "work."

>Primitive shaders, for example, have to be implemented on a case-by-case basis to be a real benefit. AMD said they work even if developers don't make custom implementations, but in much the same way that open source GPU drivers "work."
Why would you feel the need to post if you have absolutely no fucking idea what you're talking about?

He's either baiting or retarded.

If you can't read between the lines on what AMD has said about the shaders, that's your problem, but the fact that they're still not working in the driver and that AMD has said they're going to allow devs direct access later should tell you something.

Are you fucking retarded?
They simply have no idea how to expose it to devs now.

It's not working in the driver because they're trying to make it transparent, which seems like it's not going to happen.

It is going to happen though.
Stop shitposting about something you know nothing about (also citing Ryan is fucking retarded).

>enabling features is literally magic

>it is going to happen though
Probably. But the amount of trouble they're having getting it to work in the general case, and the fact that they're going to open the shaders to devs later, make it painfully obvious that the transparency is going to be just enough to get the shaders to function on their own, and anyone who wants the real performance out of them will have to implement them on a case-by-case basis. That's how this stuff has always worked. It's why people talk about optimization all the time.

>citing Ryan
Dude I'm citing an RTG engineer. Maybe I'm not the one who's shitposting here, huh.

Devs can't do shit against the driver.
The driver needs to go full retard (which is not going to happen) and dev needs to be smart enough to do it the right way.
Primitive shaders were never intended to be touched by the devs.

nah, this is all about marketshare

with AMD and Intel CPUs both having Vega, that would rocket Vega's use into the 70-80% range of the total GPU market.

Devs may optimize more for Nvidia now, but if they want to hit 70% of the market at once, AMD would be the way to go, provided this partnership is real and keeps up.

>devs can't do shit against the driver
Unless AMD wants them to. Which they said is an option. So when you say they were never intended to be touched by devs, you're just wrong.

>unless the driver goes full retard
the driver doesn't even work several months after release. If that doesn't constitute "full retard" for you, I don't know what to tell you. All signs point to AMD handing the shaders over to developers even if they get it working. It makes no sense to take a programmable shader and force generalized programming on everyone, including those who could make better use of them. ESPECIALLY when it's clear that handling the shaders in the driver isn't working the way AMD hoped it would.

Rys LITERALLY said they are NOT planning to expose them any time soon.
That alone nukes your statement so you can pretty much eat shit.
Btw, they started development of the gaming drivers in mid-July.

Again, why are you wasting your time typing up replies and our time in reading them if you have absolutely no idea what you are talking about?

Primitive shaders are one specific implementation of the more general programmable geometry pipeline in Vega, and it is that programmability that RTG is trying to find a way to expose to developers in a way that won't, 99% of the time, be worse than RTG's default NGG fast-path implementation (i.e. primitive shaders).

Even a cursory reading of the whitepaper makes it obvious that primitive shaders are meant to be developer-transparent, and if you haven't bothered to read the whitepaper, we return to the question of why you are wasting everyone's time spouting nonsense.

>more made up asspulled nonsense
(You)

You have taken the blue pill. He is quoting engineers at AMD. You are simply forcing your hopes and dreams.

(You)

(You)

>Rys LITERALLY said they are NOT planning to expose them
mobile.twitter.com/ryszu/status/896424078046892036

He said the option is on the table and that was before everyone started asking questions about why the primitive shaders weren't working. At this point, the drivers are so late that it's in the best interest of the architecture to allow developers to start working with the shaders regardless of whether or not they're transparent.

>Btw, they started development of the gaming drivers in mid-July.
They had the silicon over eight months before that; don't try to feed me bullshit that they didn't start driver development until July.

Can you give me a good reason why AMD would restrict API access to primitive shaders when they've spent the past five years exposing every feature on their GPUs to the API? Everything from the scheduler to multi GPU is exposed to the API now and you're saying they want to restrict access now? Can you give any reasoning to back up what you're saying?

Goddammit, finally someone pulled a Larrabee in actual GPU form and STILL there's at least one retard shitposting.
>NO PLANS RN THOUGH
B&

Thank you for admitting you are knowingly and deliberately lying by quoting Rys explicitly proving your nonsense wrong.

anon, Intel and AMD have worked together since the '90s; both of them worked it out so that Cyrix's lame and CHEAP processors would die

if AMD does something, they share it with Intel so Intel may improve it, and the list goes on.

Cyrix died because of their own incompetence.
Same with 3dfx.

that too, but it doesn't remove the fact that AMD and Intel played on Cyrix's playground and BTFO them back to nothingness

3dfx was pure stupidity

Nowhere in the whitepaper does it say that they're meant to be transparent. That came from the RTG engineer and he said twice that allowing direct access to the shaders is on the table, but that the focus is currently on making them transparent.

That was two months ago. If I said "I'm not going to eat right now" and two months later someone said "that guy had a nice meal" and you said "HE SAID HE WASN'T GOING TO EAT," people would rightly understand that you're an idiot. Plans can change. You're grasping at straws and haven't made any argument at all beyond what AMD said TWO MONTHS AGO. And here we are, and the shaders STILL DON'T WORK.

Do you understand what people mean when they use the phrase "something's gotta give?" It's highly applicable here.

It's not AMDs fault K7 was so good.

>what an engineer said two months ago before the cards even launched is still 100% true and applicable right now

When your view of the world is that rigid, you really can't be expected to understand nuance, so I guess you do you, bud.

(You)

the best thing about this is optimization:
fine-tuning games for hardware peculiarities on consoles, and now for the huge market share from laptops.

Nvidia can only hope to turn PC gaming into an ultra-elitist niche for rich enthusiasts and stop entry-level hardware from being perceived as remotely "capable" (capable being defined on their own terms), or go full contra-revenue on TWIMTBP and dish out some 10-20 billion trying to buy 5 more years or so in that space, until their deep learning hardware becomes their main source of income.

I was speculating and stated it as fact, but don't pretend you can speak with authority based on something one guy posted on twitter two months ago before the product even launched. Be real for a second.

>Making stuff up is more reliable than what RTG engineers said themselves and haven't subsequently contradicted.
(You)

>I was speculating and stated it as fact
Thank (You) for admitting that you're knowingly and deliberately shitposting.

It's not

I was trying to have a conversation, but as usual, you guys are too childish for that. If I'm wrong, you should try to tell me why I'm wrong without talking a whole bunch of shit or making weak-ass arguments and getting upset when I say why I don't accept them. Some anon brought up the whitepaper like three times, and the whitepaper doesn't use the word "transparent" once. It doesn't say at all how the feature would interface with software, in fact... but I'm the one "deliberately shitposting." Right.

I'm not shitposting here, son... you apparently don't know what shitposting is, but you've managed to derail the conversation into shitposting, so grats. Take pleasure in your evident "win" of an argument on the internet where you didn't say anything original or of merit. Later chummer.

>I'm making shit up and calling it fact
>but no, YOU are the ones shitposting
Fuck off and die.

...

Now we need retarded frogposter and my Vega thread bingo will be finally complete.

...

>Nvidia being nice to the AMD CPU side which received Jim Keller's blessing
>Intel warming up to AMD's GPU side which had Raja himself running away from it (at least till the end of the year)
At least one of these companies made a bad decision.

This is weird. I don't like this. It's like a glass of milk with whipped cream.

>people are trying to have discussions in a >xxx on suicide watch meme thread
come on, guys. neo-4chan isn't the 4chan of old; it's not even the 4chan of 2007. learn to avoid troll threads.

open the thread and add any frog pictures you didn't already have to your filters. then report, hide the thread, and move on.

>Making up claims out of whole cloth with zero evidentiary support and stating them as fact is trying to have a conversation.
(You)

considering that when Vega isn't primitive-bottlenecked (see the two racing games AMD now dominates in), the 56 is giving the 1080 Ti a run for its money

hard to say the GPU division fucked up when the drivers needed finalized silicon and then a near-100% rewrite for everything AMD added, with primitive shaders and tile-based rasterization (if I remember right) as the two main ones, both of which are obviously not working right now.