F

F

RIP in peace AMD

TUMBLING DOWN TUMBLING DOWN TUMBLING DOWN

that's their CPU division logo
you'd think Sup Forums knows their logos by now

>implying AMD is dead when they will be selling the 480, with 390x performance, for $200
>they also have the entire console market(including both the new PS4 and Nintendo NX) so developers will be far more comfortable with their architecture

Nvidiacucks and their shitty performance/$ $600 cards lmao

Not to mention how much smaller the Polaris 10 die is... so many more chips per wafer = more money.

DELETE THIS THREAD

Hey retard, the Radeon Pro Duo has 16 TFLOPS, which is miles ahead of the cards they unveiled today, and the final Polaris specs haven't been unveiled yet

The only one dying today is Tom

F

>Radeon Pro Duo
>4 GB x 4 GB
>$1500
>crossfire meme & shits in one card

topkek

>4 GB x 4 GB
>x
16 GB? lol

Except that there is still no support for DX12 in Fermi cards. Same shit with Vulkan, it will just never happen.

>implying it can use 8 GB HBM, when only up to 4 GB is usable in the video game.

Shit, I missed everything. What's the release for the 1070? What happened to Tom? Where the hell is the sticky?

>x
You implied it's 16 GB total, I know it's 4 + 4 GB.

/autism

>video game

June 10 for 1070, May 27 for 1080.

>no Zen or anything new for years
>only refresh and hype
Yeah it works too.

it might get some basic support, enough to run the API, but no real performance gains.

>anime board

It would be good, but I don't think it will ever happen. Nvidia just wants to get rid of Fermi....

for good reason, like how AMD dropped the 6XXX series in all but legacy support.
those cards had issues and are frankly ancient

ADORED TV SHILLS ARE STILL HERE LMAO

F

I know, but the argument of the picture is that nvidia has better support for older cards, and it's not really true.

well i can't afford a pro duo and my 4gb 380 is fine


but the CPU division is dead to me, they should of released AM4 mobos and Bristol Ridge in March but they didn't, just AM3+ crap again.

so i was left with no choice but to go intel
Maybe next time i'll look into AMD if they are not dead by then lol.

>should of

What the fuck is that wafer thing? I've already done some googling, but everything is too technical and complicated.

Don't be mean, user. Can't you see he's a tripfag?

Chips are mass produced on wafers, then put through binning to decide where each one goes.

>good ones go to high-end parts
>mediocre ones go to mid-range parts
>bad but still usable ones go to lower-end parts
>bad and unusable ones get salvaged for parts
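a toy sketch of that binning logic in Python (the tier names and thresholds are made up for illustration, not any vendor's real cutoffs):

```python
# Toy illustration of die binning. Tiers and thresholds are invented.
def bin_die(working_units: int, max_stable_mhz: int) -> str:
    """Assign a tested die to a hypothetical product tier."""
    if working_units == 8 and max_stable_mhz >= 1400:
        return "high-end"      # fully working and clocks well
    if working_units >= 6 and max_stable_mhz >= 1200:
        return "mid-range"     # a defective unit or two fused off
    if working_units >= 4:
        return "low-end"       # still usable with more units disabled
    return "scrap"             # salvage for parts

print(bin_die(8, 1500))  # high-end
print(bin_die(6, 1250))  # mid-range
print(bin_die(2, 900))   # scrap
```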

the gtx 1070 is fucking less than $500, and it's better than a titan. insane efficiency, AMD is dead

Thanks, user. Fuck those articles, you explained in 4 lines what they couldn't in 10 pages.

>less than $500, and it's better than a titan
So is the R9 390.

nerds like to ramble on about details
I tend to do it a lot

F

But guys, AMD is going to deliver a card that will outperform anything in the $300 range, r-right?

please guys

Many sources say that Polaris is not a high-end card. It's only nearly as fast as a 980 Ti, so the next-gen AMD card is already rekt by the 1070.

So.... $250 card?

480 will likely be $300; if they make a 480 and a 480X at launch, maybe $250-300 and $300-350 MSRP, with third parties pricing them at whatever.

>yfw the 1080 is only 2x faster than the 980 in certain applications where the 980 was physically incapable of outperforming a significantly cheaper AMD card

Cap this. The 1080 will only be 15% faster than a 980 in actual video games, putting it on par with a Fury X.

Nvidicucks are being fed a lie.

looks like the Polaris card will be the 390's replacement, meaning we have yet to see anything about the 390X's replacement

No, more like ~$300. It may reach $350.

Then how the fuck does it even compete with GTX 1070?

Game is lost for AMD.

R&D is not cheap. ;_;

a bit more.

wafers are silicon discs that are 350mm big (i think a newer process is 400mm not sure) and they print the chips on them

the bigger the chip, the fewer you can make per wafer, and i estimate wafers at 25-50k whether they are good or not.

imperfections in the process fuck with yields and how many chips are viable.

look at amd and the 2/3/4/6 core chips: if a part is bad, they turn it off/deactivate it, sometimes cut the part off if it's a viable solution.

this gives you varying product skus, and is why you don't see massive chips early in a process's life.

this applies to everything flash/cpu/gpu wise

now, the smaller the chip, the less likely an imperfection will cause massive parts of the silicon to be rendered useless.

now the speculation part:

due to the way DX12/Vulkan work, people are assuming that we are going to move to smaller chips and multiple chips on one card as the norm soon, and amd is already going this route with the 32 core zen, its 2 16 cores bolted into one... is the word interposer? not sure, ones cpu socket.

soon we may see multiple cpu dies on one package become the norm just for the sake of yields; intel did this with the core2quads i believe. and with how big gpus are getting, putting 4 smaller fully working dies on a package instead of hoping one big die isn't touched makes sense, though backwards compatibility on the gpu side is in question. we will likely know more about this at the ass end of next year.
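the yield argument above can be sketched with the standard back-of-envelope formulas: a gross-dies-per-wafer estimate and a Poisson defect model. the die sizes and defect density below are assumptions for illustration, not real fab numbers:

```python
import math

# Back-of-envelope yield math (textbook approximations, not any fab's data).
def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Classic gross-die estimate: wafer area / die area, minus an edge-loss term."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Poisson model: probability a die has zero killer defects."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

# Smaller dies win twice: more candidates per wafer AND higher yield per die.
for area in (230, 320):  # hypothetical small-die vs big-die sizes in mm^2
    gross = dies_per_wafer(300, area)
    good = gross * poisson_yield(area, 0.001)  # assumed 0.1 defects per cm^2
    print(f"{area} mm^2: {gross} gross dies, ~{good:.0f} good")
```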

It's a significantly smaller chip than the 1080/1070. That means better yields which in turn means better price/performance.
Probably 5-10% slower than a 1070 but 20-30% cheaper.

It's the fucking guy from 2 weeks ago who was denying he's an adored tv shill.

Didn't I tell you to go back to bed? I really think you should go back to bed.

>20-30% cheaper

that's good for college folks like me.

Can't blame them, that's cool as fuck

Thank you too, user. That was very simple and clarifying.

so much samfagging going on here

samefagging*

>390
>better than titan
You fucking wut m9

how long do you expect him to stay in bed

I got 3 words for you, you little shit:
Jim motherfucking Keller

they never did sell x80 cards at that price and they never will
just like nvidia can't sell their pseudo-flagship for $400

Damn, they went full apple this time: the fuck is founder's edition? why do better-binned chips cost a whole $150 more?

On the other hand, first 2GHz card, kinda cool.

well, it's true in games, titan is garbage now, they focused all driver support on the 980/970

oh and speaking of 2ghz

That's nice on its own, but it means they didn't improve core IPC and just pushed maxwell to its limits on the new shrink.
Logic is this, if they could make 1.4ghz card with 90w tdp and same performance they most definitely would. But they have to push those watts to keep it competitive.
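that logic is basically the first-order CMOS dynamic power relation, P ≈ C·V²·f: higher clocks usually need higher voltage, so power grows much faster than linearly with frequency. a quick sketch with invented voltage/frequency numbers:

```python
# First-order CMOS dynamic power: P ~ C_eff * V^2 * f.
# The operating points below are invented for illustration.
def dynamic_power(c_eff: float, volts: float, freq_ghz: float) -> float:
    return c_eff * volts ** 2 * freq_ghz

base = dynamic_power(1.0, 0.90, 1.4)    # hypothetical "comfortable" point
pushed = dynamic_power(1.0, 1.05, 1.8)  # higher clock needs higher voltage
print(f"pushed/base power ratio: {pushed / base:.2f}")  # 1.75x for +29% clock
```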

update your shill handbook, jim keller jumped off the sinking ship like a year ago lmao

the og titan is shit for gaming because so much of its die is fp64 hardware that games don't use

>Logic is this, if they could make 1.4ghz card with 90w tdp and same performance they most definitely would. But they have to push those watts to keep it competitive.

higher clock speed =/= higher power consumption, if pascal is anything like maxwell nvidia will sell the chips with the lowest leakage to gaymers so that they can get strong OCs without voltage increase.

Keller is a freelance designer
he finished his job and left, as per the contract.
he's not going to stay there for 2 years with his thumb up his ass while they finish it.

nice damage control raja, keller stated that he left because AMD wasn't going in the direction he wanted and because samsung gave him a better offer.

he is at tesla now, at least get your facts straight before lying

>being legitimately retarded and buying a brand new card the second it comes out

unless you are a millionaire you wait 6 months and grab something that's lower in price...

oh, and did I mention the drivers will be optimized by then, because Nvidia isn't perpetually driver cucked like AMD?

> not knowing how Jim Keller works

Nvidia has made shit drivers for the past 6 months

Thanks for giving me the opportunity to learn something today. Quite refreshing on this board.

>and amd is already going this route with the 32 core zen, its 2 16 cores bolted into one... is the word interposer? not sure, ones cpu socket.

no, the 32 core zen is its own die manufactured specifically for a CERN supercomputer. the server opterons amd releases are going to be two 8 core dies on an interconnect.

if we get 32 core zen cpus at all it will be something equivalent to intel's xeon phi, probably with a beefy gpu and memory on the package.

i'm not the shill, but i do watch the guy, and what he said makes sense, though we will know if it's true in a year or so... as for the cpu, that's a cern leak, that is real.

the 280x was $300 MSRP when it came out, had that crypto coin shit for a good 4-6 months, and shot up to around $800

it's reasonable to assume the sku is in the $250-350 range depending on what they make.

depending on the binning, it can either oc a fuck load more or something like that. if you have the clear power advantage, why would you not sell it for more unless you had other motives?

There literally is no point in supporting cards that old.
Who needs dx12 if they won't really run new games anyway?

>wafers are 350mm diameter
No, wafers used today are 300mm (11.8 inches).

>I estimate wafers at 25-50k
No, printed and assembled wafers typically cost 5-7k with the exception of fantastically new technologies.
In which case wafer cost might be 10k
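those wafer prices make the cost-per-good-die math easy to sketch; the die counts and yield fractions below are assumptions, not real numbers:

```python
# Cost per good die from wafer cost, gross dies, and yield.
# All inputs below are illustrative, not actual foundry pricing.
def cost_per_good_die(wafer_cost: float, gross_dies: int, yield_frac: float) -> float:
    return wafer_cost / (gross_dies * yield_frac)

small = cost_per_good_die(6000, 260, 0.80)  # smaller die: more dies, higher yield
big = cost_per_good_die(6000, 180, 0.70)    # bigger die: fewer dies, lower yield
print(f"small die: ${small:.2f}, big die: ${big:.2f} per good chip")
```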

>smaller dies, multi-die chips
I just don't see this happening as far as GPUs are concerned, everyone learned their lesson from 3dFX going the way of the dodo from doing exactly this.
AMD does not have the resources, time and manpower to implement such a design. It's difficult enough to tune drivers right now and it would be foolish to rely on game developers to properly implement complex support.

>higher clock speed =/= higher power consumption

>other motives
i'd forget the revenue margins and try to undercut nvidia as much as possible without losing money to gain back market share

>everyone learned their lesson from 3dFX going the way of the dodo from doing exactly this.
that's not why, they tried to cut ties with asus, evga, msi, gigabyte etc. by buying their own board maker that did that, and of course when all their partners turned on them they went the dodo way

Unlike you I was actually alive when these things happened.
The purchase of STB was only a sideshow to the long running main event.

Their chips were essentially the same from the Voodoo2 onwards to the slow-to-market, difficult Voodoo5.
Failing performance, a reliance on their proprietary GLIDE API when everyone was moving to DirectX 8, wasting money on things that weren't their core business, and the sheer cost and complexity of everything beyond the Voodoo3 3500 were the real causes of their demise.

Huh? I bought 5" p-Si wafers for less than 30 bucks each. How could something twice the diameter cost 100000% more?

youtube.com/watch?v=3MghYhf-GhU

then they lied here, okay

Did you buy it on eBay as a wall ornament? Because then you've essentially bought garbage.

It's generally agreed upon that 3dfx's demise was their non-competitive product
They did an AMD-meme-MOAR-CORES instead of making their processors more advanced.
They were late to market, and when they finally got there they were so noncompetitive it was a foolish waste to even consider their product.

Anything 3dfx had died after Voodoo3
Again, STB was a sideshow.

Fuck no. I bought them for my degree project

Like so

>in a year or so

Are you high? His "masterplan" can't come into effect for at least 5 years from now.

ah
new nvidia cards on the horizon
time for the kids to do some amd bashing so they can justify wasting a thousand dollars of mommy's money on a new card when their SLI 980 Tis with the custom closed-loop cooler are working just fine.

have fun guys