>>58358162

Titan X (Pascal)
FP64 1/32

Lol


>VEGA 20 is a 7nm

Wut

Clearly they're aiming Vega as a replacement for Hawaii in the DP market.

So consumercucks should just deal with shit DPFP since they're consumercucks

delett

Stop fucking up the narrative. We're hating on AMD now.

I'm pretty hype for Vega

>pixelizing an obnoxious watermark and putting another obnoxious watermark over the top

>225W being a problem

user...

He lives in North Korea, haven't you seen night satellite images of that place?

The bigger the disparity, the better. Buttcoin miners get cucked and you get your GPU for cheaper.

If this leak is true then AMD is in deep trouble

Vega 10's target release is 1H2017. Most likely that means late Q2.
Based on what we know so far, Vega 10 performs roughly on par with a GTX 1080. I know AMD fanboys will disagree, but so far we've only seen cherry-picked, incredibly AMD-biased results (Doom Vulkan, AoTS), and some holes in the 'Titan XP killer' narrative are emerging, like the 37 FPS figure.

The moment Vega launches Nvidia will cut prices for the 1080 and 1070. At this point in time Nvidia has already enjoyed 1 entire year of record-setting 1080/1070 sales with a big fat profit margin because of the total lack of competition. Nvidia will lose some market share to Vega but not much because people are mostly interested in playing GameWorks titles (WD2, Division, and upcoming titles), and the price cut will keep them competitive.

Then, in Q4 2017 Nvidia launches Volta which will destroy Vega. AMD's only advantage now is HBM2 which Nvidia will adopt with Volta. Nvidia will price Volta VERY competitively in an attempt to wipe out AMD. Expect Titan XP performance at 970 prices.

So in 2017 AMD will have achieved at most two quarters of mediocre sales, and seeing how they intend to reuse Vega until late 2018, the future is definitely not bright for Dr. Su.

Oh hi Huang. Where is 1080ti you damned gpu gook?

If you're going to post erotic fanfic warn a nigga first. Ain't no one finna see your stroke material.

Second, Vega 10 is the small Vega, and even with one leg and both hands tied behind its back it's fucking up a 1080.

Nvidia is shell-shocked: no announcement has been made yet about Vega, so they can't issue a press release trying to steal AMD's thunder. It's a game of brinkmanship.

> Vega 10 is the small Vega
No, it's not. Similar to Polaris 10 and 11, 10 is a performance card.

>225W
>a lot
Nvidia's 10TF card uses 250W. Vega is 12TF for 225W
>7nm in 2018
as expected

According to the slide Vega 10 is 16GB, which means it's big Vega. Also, the 24TF FP16 corresponds to the $3000 MI25 deep learning flagship card that AMD announced some time ago. Obviously AMD will use the biggest GPU they have in that product, because they will likely generate far higher profits selling AI/DL cards to institutions than selling 4K gaymen cards in the niche high-end gaming market. So, all in all, Vega 10 is most definitely big Vega.
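
Napkin math below (assuming the rumored 4096 stream processors and a ~1.5 GHz clock, neither of which is actually on the slide) shows how the 12-12.5 TF FP32 and 24 TF FP16 figures would fall out of double-rate packed FP16:

# assumed specs, not from the slide: 4096 SPs, ~1.5 GHz boost
shaders = 4096
clock_ghz = 1.5
fp32_tflops = shaders * 2 * clock_ghz / 1000   # 2 FLOPs per FMA
fp16_tflops = fp32_tflops * 2                  # packed FP16 at double rate
print(fp32_tflops, fp16_tflops)                # ~12.3 and ~24.6

If those assumptions hold, the numbers point at one full-fat die rather than a cut-down part.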

>Posting simple shit for simple minds that are easily disproven

>Basing performance estimates on early demos

>Volta
>2017
not for consumers

>Expect Titan XP performance at 970 prices
kek

see

The 12.5 TFLOPS/512GB/s Vega variant being confirmed 225W is pretty bad news for Nvidia.

I'm curious if the x2 (1TB/s) variant is the same die or not (assuming it is), and whether the extra 75W is purely from the added memory or whether the GPU itself will be clocked higher.
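
For what it's worth, the bandwidth figures line up neatly with HBM2 stack counts, assuming full-speed 2 Gbps/pin stacks (which isn't stated anywhere):

# each HBM2 stack: 1024-bit bus * 2 Gbps per pin, assumed full speed
per_stack_gbs = 1024 * 2 / 8      # = 256 GB/s per stack
print(2 * per_stack_gbs)          # 512 GB/s  -> the single-GPU figure (2 stacks)
print(4 * per_stack_gbs)          # 1024 GB/s -> the x2 figure (4 stacks total)

So the 1 TB/s figure alone doesn't tell you whether it's one die with four stacks or two dies with two stacks each.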

Nvidia deliberately held off announcing 1080Ti because they wanted AMD to announce Vega first.

AMD didn't, and now they're stuck in a place where everyone who wanted a 1080/1070 already has one, and the people who want something better would have to shell out another $1,200 for a Titan XP, which 99.99% of consumers simply won't do.

If they launch the 1080 Ti it has to slot in well above the 1080, and supplies will be limited at best. $900-1,000, more for the Founders Edition.

If they end up launching before AMD, and AMD has a big response to it for less than that, they lose.
If they end up launching before AMD, and AMD doesn't have a big response, well they get another few thousand buyers for ultra high end GPUs. Maybe people replacing AMD GPUs with theirs but most likely not.
If they wait out AMD launching to steal their thunder, they lose consumer attention. People are already pissed at them not talking about their GPUs basically at all at CES.

They aren't in a bad position, but AMD is far more primed for a big launch than they are.

>mfw Polaris was Fury performance all along
>mfw desktop got the scraps

yeah but what's the wattage at that clock speed?

If it ain't lower than Fury it doesn't matter. All AMD did in that case is make a cheaper chip, and while that's cool and all, people want cheaper, faster, and less power draw.

Less than 150W, I believe. I'll go look at the video once more. It'd be exciting if they released these as the RX 570 for the next series.

>everyone who wanted a 1080/1070 already has one
This is a dumb argument; while it makes sense superficially, it is absolutely not corroborated by the facts.

If you refer to the Steam hardware survey
store.steampowered.com/hwsurvey/videocard/
the 1060/1070/1080 market share has been growing at an almost constant rate since release with NO slowing down.

Also, it is notable that the RX 480's market share growth is absolutely pathetic; even the 1070 outsells it 10 to 1, to say nothing of the 1060.

Some people here find it difficult to believe that not everyone rushes to buy graphics cards the second they are launched but clearly this is the case.

Right here. He mentions something about a 140ish W peak.

Steam hardware survey is
1. Non-mandatory
2. Not widespread
3. Not done every month

Yes, it can give you %s for rough marketshare.
No, it can't give you actual %s for individual card marketshare.

>300w
Just to compete with the 220w gtx 1080?

Pathetic.

AYYYYYMD IS FINISHED AND BANKRUPT

This is nothing shocking 2bh. People expect there to be a massive userbase for the 480 because of YouTube and NeoGAF shills, but just looking at the raw marketshare facts it's easy to see AMD are being absolutely dominated by Nvidia in every field.

>Can't read a powerpoint slide
pathetic

You raise valid points, but none of those three limitations selectively excludes AMD or Nvidia systems, so they would not skew the results toward either camp.

Another point is that the survey reveals how entrenched Nvidia's position is in the gaming market. Because AMD's market share is so low, developers are reluctant to devote significant resources to optimising games for AMD's uarch, and this acts as positive feedback that further depresses AMD's appeal to consumers. It's a vicious cycle.

I've got the same card, but I haven't bothered overclocking it. On full load (Unigine Heaven, BF1, BF4, Witcher 3) GPU-Z reports the average power draw of the core to be around 90W.

The XFX card is highly underrated.

The GTR? Try some benchmarking software and see where it goes. Maybe you'll get lucky and have nice 1400+ MHz OC.

Is the x2 variant an underclocked CrossFire card, a minor variant of the V10 with two more HBM controllers bolted on, or something else?

>Vega x2
That's interesting I guess. For the enterprise I mean. MCM GPUs fucking when? I can't help but giggle at the thought of having 4 cute small dies like Polaris 10 on one card with no problems as a performance card.

If AMD can get dual Vega 10s into a 300 W envelope, a 150-175W Vega Nano or ridiculous stock Vega undervolts should be possible.

>MCM GPUs fucking when?

This is a hard problem already and gets worse the more renderer pipelines rely on multiple passes and previous frames' buffers. Besides load balancing challenges, you can't have shaders stalling continuously on pulling data from remote nodes, and preemptively broadcasting buffers gets expensive and can scale poorly. Binning/rasterizers could help, but there's likely a long way to go for this goal.
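
A rough feel for the broadcast cost, with made-up but plausible numbers (4K target, a fat 16-byte-per-pixel G-buffer, 60 fps):

# all assumptions for illustration: 4K render target, 16 B/pixel of G-buffer data, 60 fps
width, height = 3840, 2160
bytes_per_pixel = 16
fps = 60
gbuffer_bytes = width * height * bytes_per_pixel   # ~133 MB per frame
link_gbit_s = gbuffer_bytes * fps * 8 / 1e9
print(link_gbit_s)                                 # ~64 Gbit/s just to mirror one buffer every frame

And that's before shadow maps, previous-frame buffers, or texture streaming ever cross the inter-die link.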

1080 is a "small pascal". The chip used by Titan XP is colossal compared to the 1080.

>navi 2019

Will it run modern games? Yes
Is it cheaper? Yes
Does it have nVidia botnet? No

I'm sold.

I am at this point of reasoning as well.

It's like buying self defense gun these days.
will it shoot? yes
will it jam? no
is it small? yes

sold.

and fuck your .45 faggotry

>1st half of 2017

Maybe it's AMD's strategy to grab market share this year no matter what.
They spent a whole extra year on this thing to undercut Nvidia.
The price will be $350-400 at ~1080 performance. Nvidia won't go THAT low, they simply can't, the die size is too big.
On the CPU side it's the same tactic: after being jewed by Intel for 7 years, everyone expects Ryzen to go for $800 just because it's been like that for too long.
Intel can lower prices on the 6900K, they make huge margins on that thing for no reason, like US ISPs, and nobody complains. They can't do the same for motherboards, though.

It's going to be a fun year.

AMD has a shit reputation everywhere. There are a lot of shills and retards who only like AMD because it's the underdog and less popular, but the majority of people who buy the cards are not retarded hipster wannabes.

>failing to understand the concept of a fingerprint watermark

>AMD has a shit reputation everywhere
By that you mean the USA and britcucks?

That's not even a mid-sized PC market.

>225 watts

So anons were right. It's a fucking dual GPU monstrosity.

Actually it does. Steam is massive, like really fucking massive. Even if only 1% of users were retarded enough to accept the survey, you'd still have a poll sample of 150,000-200,000 respondents, and that's only counting users active in the last 48 hours. If a sample of 1,000 is more than enough to tell who is going to become POTUS, then a sample 200 times larger than that should be good enough to tell who is using what GPU. I hope you don't imply that AMD users are somehow special snowflakes that are too autistic or too smart to accept a survey.
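
The sample-size part checks out against the textbook margin-of-error formula (95% confidence, worst-case p = 0.5), though note the formula assumes a random sample, which is exactly what the opt-in complaint disputes:

import math
def moe(n):                 # margin of error at 95% confidence, p = 0.5
    return 1.96 * math.sqrt(0.25 / n)
print(moe(1000))            # ~0.031 -> ~3%, the typical POTUS poll
print(moe(200000))          # ~0.002 -> ~0.2%, IF the sample were unbiased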

AMD GPUs are DOA in the whole of Europe. They were somewhat popular in Eastern Europe back when we were all ultra poorfags and could earn some cash by mining. Since mining is dead there is absolutely no reason to buy an AMD card in the EU, since it costs as much as its Nvidia counterpart while being hotter, louder and worse at release.

>Second, Vega 10 is the small Vega
>guy in the doom movie clearly says 'flagship vega'

I was using a GTX 560 for a long time, and every month I was asked to participate in hardware surveys. These last 2 years I've been using AMD graphics cards, and I only just received my first offer to participate in December.

I've read other people have experienced a similar thing, either going from nvidia to AMD and getting virtually no offers, or going from AMD to nvidia where they suddenly get lots of offers. It could just be coincidence, but it seems odd.

Yep. Can confirm. Pricing is absolute shite from AMD.

>Nvidia will price VERY competitively

>AMD's only advantage now is HBM2 which Nvidia will adopt with Volta
Accept that AMD holds the patent for HBM and HBM2.

>Then, in Q4 2017 Nvidia launches Volta which will destroy Vega. AMD's only advantage now is HBM2 which Nvidia will adopt with Volta. Nvidia will price Volta VERY competitively in an attempt to wipe out AMD. Expect Titan XP performance at 970 prices.

Not going to happen.
>kitguru.net/components/graphic-cards/anton-shilov/amd-we-are-actively-promoting-usage-of-hbm-and-do-not-collect-royalties/
>Nvidia Corp. has publicly revealed that its next-generation “Pascal” architecture of graphics processors supports second-generation HBM memory and in 2016 graphics cards with GP100 GPUs and HBM2 DRAM will hit the market. In fact, HBM support is a key feature of Nvidia’s “Pascal”, which will help the company to triple the bandwidth available to its next-gen GPUs, thus significantly improving their performance.
>Earlier this week a web-site reported that Nvidia will delay adoption of HBM because of royalties demanded by AMD for its HBM-related intellectual property.

So Volta will just be a Pascal refresh with HBM, and Nvidia will have to pay royalties to AMD for using HBM.

except* as in exception not accept like you're getting accepted into uni

...

No they don't, you derp. They co-developed it, but they have no plans to hoard it exclusively.

Not that it matters. They didn't make it FOR GPUs, they made it for APUs and only put it in the Fury as a big advertisement and proof of concept.

Zen/Vega APUs are going to be the big moneymaker for Zen in OEM products.

480 is cheaper here

People tend to read up on stuff before buying anything; it's one of the prerogatives of being poor => you spend a lot of time researching before spending money.

The 480 is nowhere to be found, and there are a ton of 1060s in the stores.

That's how it is here, too. When I went in to get my 480 I asked the sales rep what they're selling more of, and he said the 480. The 1060s and 1080s were mostly untouched; only the 1070 was selling in good numbers.