Genuinely forces you to contemplate... hmmmm

Genuinely forces me to contemplate why they compared it to a 950 instead of a 960 and how the hell they got a 950 to use 140w.

Oh wait. This is the same company that claimed this.

I still don't understand how this company has such a blind, rabid, slavering fanmob to defend their lies.

Love getting my money's worth when I spend $440 on a graphics card

Genuinely forces me to contemplate if they are using the lowest power usage reading rather than an average, or if they specifically chose "med preset" so as not to use its much higher average of 160+w.

Genuinely forces me to contemplate how anyone with half a brain can be so forgiving of AMD and yet so critical of Nvidia for lying.

Hello sirs this is a sponsored AMD forum. Please take discussion to Sup Forums.

Welcome to Sup Forums. You might want to talk to your supervisor on how to post here as you are clearly new.

It's Polaris 11
and those figures are total system power consumption

tech illiterate shitposter

It's average; they demoed it live. This is from back in January. Read the AnandTech write-up.

>It's Polaris 11
Testing was done in 12/2015, and it is not Polaris 10, which they are fixing to launch early next year. No. It's Polaris 11, which they haven't even gotten mocked-up hardware for yet.

You sir, are the perfect example of an AMDrone making up any excuse to forgive AMD lies.

...

>has no idea what he's talking about
They showed off the die. It was Polaris 11, a ~50W design that addresses entry level and mobile.

Tech illiterate sub human children need not post here.

DELETE THIS

install gentoo

>They showed off the die.
When did they show off the die? Months AFTER this internal lab testing?

>has no idea what he's talking about
Oh the irony!

>This is from back in January

>AMD Internal Lab testing as of Dec 2, 2015
You are claiming that they are even lying about when AMD did the testing?

AMD lying about everything

You guys do know they were talking about whatever chip is used in the RX 460?

>rx460
[citation needed]
I find it hard to believe they would advertise any chip or GPU other than the one they launch right after such an advertisement.

The demos were clearly faked. It was funny to see AMD marketing and shills meme about perf/watt too, especially since poolaris is equivalent perf/watt to maxwell despite poolaris being on the 14nm process and nearly half the size of gm204.

The same meme hype train will start up soon for vega if the rumor about the release being moved up to october is true.

going to have to look for the demo bench, but this was polaris 11 they were showing off, either the lowest end gpu they are making or a laptop part.

this ad is from december when amd announced poolaris, they explicitly said that this fake demo was polaris 11 (which is only being released as the 460 in dgpu form and the other 4 variants of poo 11 are being used for igpus)

>going to have to look for the demo bench
Don't let me keep you from providing a link. Or are you expecting someone else to prove your claim for you?

Calm your tits it's for the 460

They showed the die to people who actually matter.

>I find it hard to believe...

No one gives a fuck what you believe. The power draw proves it.

>shows off a p11 slide
>talks about p10

just another day on Sup Forums

>shows off a p11 slide
Still waiting for a citation when all I get is repetition.

>and those figures are total system power consumption
Yeah, I doubt it. 86W is way below a decent gaming PC. That's the kind of power consumption you'd get at idle with the thing running.

Interesting how one of AMD's supposed demo cards had DVI despite the final product not having a DVI connector.

Even more interesting is that several non-reference 290x and 390x cards had similar configuration: core0.staticworld.net/images/article/2015/07/strix-r9-390x-100596078-orig.png

fury non-x also had the same configuration of ports. wonder why AMD would lie about something like this? what was there to gain by being deceitful?

did they ever say it would come with dvi?

holy delusional

There is a live demo of this using a Kill-A-Watt wall draw device telling you total system draw you fucking moron. The P10 variant with nearly twice the SPs naturally consumes more (~240W total system draw). The Polaris 10 cards weren't aimed towards this "medium settings 1080p60" area whatsoever.

youtube.com/watch?v=H9ZIcztgcNY

It's rather clear that AMD had the hardware way before the launch date, they likely had the silicon Q2 last year.

The card has a DVI header that board partners can attach DVI out to if they choose to drop the same PCB.

So the 950 figure was total system usage, not just the video card or the "architecture" as the ad misrepresents?

Hilarious how a budget card like the 480 doesn't even come with DVI, which is what most older monitors will use.

Interesting how one of Nvidia's supposed demo cards had wood screws despite the final product not having any wood screws.

I was very upset about this in shop class.

>The card has a DVI header that board partners can attach DVI out to if they choose to drop the same PCB.

the point wasn't to complain about there being no dvi connector, but to show that the demo card was clearly not a genuine polaris 10 reference card. by the time they showed that polaris 10 demo (which was in april or may iirc) the PCB would have already been finalized and in production.

kek

>nvidia being deceitful and lying to customers excuses amd being deceitful and lying to customers

that's some great whataboutism there pajeet

>The Polaris 10 cards weren't aimed towards this "medium settings 1080p60" area whatsoever.
The RX 480 is a P10 card, yes? It is not aimed at the 1080p60 area whatsoever? Really? What is it aimed at, then?

Both companies lie, but AMD never had fucking wood screws.

They actually did. The fury x2 that was originally shown when AMD announced the fury x and nano was a non-functional mockup. The final product was completely different.

AMD just has cards that kill motherboards, no biggie

That's normal to have changes. That's why it's not final. Also, what about when Nvidia saw-cut a PCB of a demo card? You can't find stuff like this with AMD.

Find me a video like this from AMD

youtube.com/watch?v=sRo-1VFMcbc

Find me a video where they cook an egg on an AMD

youtube.com/watch?v=ASu3Xw6JM1w

>That's normal to have changes. That's why it's not final.

lol, not on the level the fury x2 went through. they changed the layout of the board completely and replaced many of the power delivery components with different ones, most likely because they were either glued on fakes or just weren't sufficient for the design in the first place. images.anandtech.com/doci/10279/RadeonProDuoBareD_StraightOn_4c_5inch.jpg

>Also, what about when Nvidia saw-cut a PCB of a demo card? You can't find stuff like this with AMD.

more whataboutism. as i said, nvidia being deceitful doesn't excuse amd doing the same thing. you're also using one example from 6 years ago; amd was doing this shit as recently as 2 months ago with the fake fury x2 pcb and the fake polaris 10 demo card.

samefag AMDummy getting so mad :^)

Sorry you buy inferior AMD products.

>most likely
Oh, so you're just making shit up?

>6 years ago
Same motherfucker runs the company now. They just got ass-blasted so much they try not to do it again, but guess what? 3.5 was not 6 years ago

Later mother fuckers!!!

yes, not to mention the fact that lisa stated that cooler master couldn't find a way to cool the vrms in the place they had them...
obviously you don't know anything...

when ferrari asked for the evolution of the 430, the 458, the initial drawings were almost the same as the 430 because of the air ducts and cooling system... they had to literally redesign the whole car in order to fit the final design into the chassis

>unironically memeing about thermi in 2016

you realize that the 290/290x/390/390x all drew more power than the 480 and 580, right?

techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/30.html
techpowerup.com/reviews/AMD/R9_290X/25.html

kek, 290x draws more across the board

and remember the "N" stands for quality!

The "ad", or rather the Radeon update, is somewhat misleading in that it is extremely opaque; but it becomes elucidated by the live demo.
The PCB also had DVI pinout. AMD may have decided to change the board cooling solution and connectors last minute. Regardless, from looking at that picture, no one can know if that was a genuine P10 card. IIRC it was leaked by a journalist during a demo without context (I believe it was running Hitman).
It's aimed at entry level VR, as stated on slides. The 470 (also P10) is intended to max out games at 1080, and the 460 (P11) is designed for "esports" and 1080p gaming.

we call it thermi because it holds the record for being a fucking fusion reactor, passing 100c with ease

I couldn't give less of an iota of a single shit how many Watts my _desktop_ GPU is drawing.

What did he mean by this?

>it becomes elucidated by the live demo.
You mean the youtube link where the people making the video had no information other than what AMD gave them, including not knowing which version of polaris was being demoed?

>It's aimed at entry level VR, as stated on slides.
Perhaps I am remembering this wrong but I thought the 480, at launch, was advertised as a low cost 1080@60fps card or "designed for premium gaming beyond HD?"

>Laptop part
>86W
Yeah, no. That's a whole laptop's worth of power on its own.

perf/watt is a big deal when you need to pack 8 gpus each into a few dozen machines for gpu mining or other gpgpu compute stuff. it's part of the reason why nvidia is so dominant in that market, amd has been an entire generation behind them in perf/watt for a few years now.

>up to

>all this damage control in light of the vulkan Doom update

i honestly find it hard to believe anyone could believe that was the 480
>destroys the 970
why would they be comparing it to the 950??

he was talking about 470 and 460 (470 being 2.8x over the previous gen of amd)

And that is the crux of how AMD blinds its fanmob. AMD is going to win because Vulkan has "potential" to replace DirectX. Devs have the "potential" to use async for multiple GPUs. The 480 has the "potential" to be better than its Nvidia competition. So they'll just believe the speculation based upon AMD's potential and disregard any inconvenient evidence.

>Wait for Vega even though Polaris is turning out to be turd
>DX12 will favor AMD even though half of the current games on DX12 perform better on Nvidia cards.
And then there is the perennial
>They are using beta drivers. AMD cards will be better with new drivers by which time new cards are launched.

>why would they be comparing it to the 950??
That's one of the questions. Another would be why they would advertise a gpu that is not the one coming out right after the advertisement.

>AMD is going to win because Vulkan has "potential" to replace DirectX
DX12 or Vulkan doesn't matter; AMD are the only ones with GPUs that support modern APIs

>And that is the crux of how AMD blinds its fanmob. AMD is going to win because Vulkan has "potential" to replace DirectX. Devs have the "potential" to use async for multiple GPUs. The 480 has the "potential" to be better than its Nvidia competition. So they'll just believe the speculation based upon AMD's potential and disregard any inconvenient evidence.

it's funny how AMD and its fanboys still can't learn this. devs will not do extra work to extract a 10% performance gain from hardware that 1% of their customers use. the same thing happened with the terascale architectures, bulldozer and GCN. it won't change with the poolaris GCN rehash. AMD will have to do the heavy lifting themselves if they want their cards to sell.

A lack of information does not equate to them lying to you. The demo was the only official piece of information at the time, which was released in January during the RTG Summit. AMD was simply exhibiting that they had a technology that focused on power efficiency.
Raja mentioned the 480 as a 1440p-capable card (benchmarks show that the 480 is at its limits at that resolution; the memory bus simply isn't capable), but the 470 is mentioned as the 1080p60 card (further, that it would be able to max out games). The 480 is repeatedly hailed as an entry point to VR.
The reason nVidia is winning in GPGPU is not because of perf/watt; that factor is rather insignificant because compute kernels run faster per watt on GCN by design since GCN 1.0; it's just an extremely ALU-dense design. GPGPU is extremely popular on nVidia because they pioneered high-level GPGPU APIs through CUDA, which is, at least for the moment, far more mature than OpenCL and is tied to the majority of GPU compute libraries.

since their target audience isn't 12 year olds incapable of comprehending something more complex than a 12 times table, it doesn't matter if retarded children like you don't understand.

>A lack of information does not equate to them lying to you.
Misrepresentation is not just as deceitful as lying?

>Raja mentioned the 480 as a 1440p capable card
Capable=designed as its primary use? Thank you for proving that misrepresentation is just as deceitful as lying.

because it was the polaris 11 being shown, not the polaris 10...
youtube.com/watch?v=hKEcLxqofIQ

>86 Watts
Is that the power usage from the PCI-e slot alone?

no shit, i was pointing out the nvidiot's stupidity

Thank you for the classic example of an ad hominem argument.

Or perhaps you can explain what they meant in a way that even a 12 year old can understand, though I expect you'll stick to such offensive and vague defending of obvious bullshit from AMD.

get back to hanging off nvidia's 3.5mm long dick

Allegedly that's what the system draws from the wall outlet.

sorry user i have lost quite a lot of iq these days trying to argue with them

>offensive and vague defending of obvious bullshit from AMD
Yeah, that's what I thought.

youtube.com/watch?v=oNA6fll2DDQ
"allegedly"
no, that's exactly what it was.

>The reason nVidia is winning in GPGPU is not because of perf/watt; that factor is rather insignificant because compute kernels run faster per watt on GCN from design since GCN 1.0,

you're confusing perf/watt with raw performance. GCN has better performance clock for clock than an equivalent NVIDIA card in these applications, but it draws 2-3x the power to achieve that.

Did AMD provide the meters or were they verified to be functioning as they should be by independent reviewers?

>obvious bullshit
[citation needed]

that would be such an nvidia thing to do

Because AMD is above lying or misrepresentation? See the pic for an example.

It's not about how much power it draws, it's about how HOT it got.

>being Australian

You never get your money's worth on AMD.

maybe throwing in 8gbps ram as opposed to something more standard was a last minute thing?

all these conclusions you're jumping to and all this shitposting you're doing, i think you deserve a raise from nvidia. they probably don't care about you enough for that though.

who cares about how hot a card gets, especially a reference card?

if you're not setting up a custom fan profile for any gpu you buy then you're doing it wrong. more often than not they are misconfigured out of the box and don't even ramp up past 20% until the card is in failure territory.

>who cares about how hot a card gets
nvidiots, everyone. justifying their housefires as usual.

I don't know, why don't you go ask all the rabid fanboys who think that 80c is hot for the RX480? or 90c for the 290?
I think you're forgetting that Fermi would easily reach 105 - 110c

>maybe throwing in 8gbps ram as opposed to something more standard was a last minute thing?
Even so, it was deceitful to claim it was "unprecedented" even if it was accidental. That's the same thing we raked the 970's 3.5GB over the coals for.

>Fermis
>tfw

if you let your GPU reach 80c, 90c or 105c in the first place then you're a mouthbreathing retard (which is why there are so many people complaining about high temps on AMD cards: stupid people naturally buy defective rebranded products)

any non-braindead person knows they should be setting up a fan profile with afterburner or speedfan to maintain temps under 70 or 60c.
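the fan-curve idea is just interpolation between a few (temp, duty%) points; a minimal Python sketch, where the control points are made-up illustrative values, not anything Afterburner or SpeedFan actually ships with:

```python
def fan_duty(temp_c):
    """Map GPU core temperature (C) to fan duty cycle (%)."""
    # (temperature, duty%) control points -- illustrative values only
    points = [(30, 20), (40, 30), (60, 60), (70, 85), (80, 100)]
    if temp_c <= points[0][0]:
        return points[0][1]
    for (t0, d0), (t1, d1) in zip(points, points[1:]):
        if temp_c <= t1:
            # linear interpolation between neighboring control points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return points[-1][1]  # pin the fan at max past the last point

print(fan_duty(65))  # ramps well before the card hits 70c
```

the point of front-loading the ramp is that the curve hits high duty before the card ever reaches the temps people argue about in this thread.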

>86w

Off the PCIE?

that's the entire system, the gpu is 50w tdp
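a back-of-envelope check of what an 86 W wall reading leaves for the GPU; everything below except the 86 W figure (from the demo) is an assumed number for illustration:

```python
# Sanity-checking the "86 W total system draw" demo claim.
wall_draw = 86          # W at the outlet -- the figure from the demo
psu_efficiency = 0.90   # assumed: an 80 Plus Gold-ish PSU at low load
rest_of_system = 45     # W, assumed CPU/board/RAM/SSD under game load

dc_power = wall_draw * psu_efficiency      # power delivered inside the case
gpu_estimate = dc_power - rest_of_system   # what's left for the GPU

print(f"estimated GPU draw: {gpu_estimate:.1f} W")
```

under those assumptions the card itself is drawing well under the quoted 50 W TDP, which is at least consistent with a total-system reading rather than a GPU-only one.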

>Misrepresentation is not just as deceitful as lying?
A lack of information does not equate to misrepresentation.
>Capable=designed as its primary use? Thank you for proving that misrepresentation is just as deceitful as lying.
??? Yes? It was designed for "premium VR" according to slides. Of course this is very opaque, and a lot of people overanalyze it to extrapolate performance because their lives revolve around computer hardware for whatever reason, but the common folk will have no such gripe.
I think that's rather tenuous. cgminer MH/s runs much, much faster per watt (450MH/s peak on my 7950 stock whereas CUDA mining on a 680 rakes in only 100MH/s). Of course, that depends on the task at hand, but any task that can be run concurrently at load with many threads will massively favor the design of the SIMD-V pipelines in GCN.
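the perf/watt comparison is simple division; the MH/s figures below come from the post above, while the watt figures are assumed reference TDPs, not measured draw:

```python
# Perf/W from the mining numbers quoted in this thread.
# MH/s: from the post above. Watts: assumed reference TDPs.
cards = {
    "HD 7950 (GCN)":    {"mhs": 450, "watts": 200},
    "GTX 680 (Kepler)": {"mhs": 100, "watts": 195},
}
for name, c in cards.items():
    print(f"{name}: {c['mhs'] / c['watts']:.2f} MH/s per watt")
```

even if the assumed TDPs are off by a fair margin, the throughput gap is so large that the per-watt comparison for this particular workload doesn't flip.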

>A lack of information does not equate misrepresentation.
Presenting information in a manner that would be easily misunderstood by the common audience is misrepresentation.

Didn't we have a recent thread about the 1060's comparison with the 480, using a bar graph starting at 0.8, that was loudly proclaimed by AMD fanboys as similarly misrepresentative even though there was no lack of information?
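the axis-truncation complaint is just arithmetic; a quick Python sketch with hypothetical relative-performance numbers (not the real 480/1060 figures):

```python
# How a bar chart whose y-axis starts at 0.8 instead of 0 exaggerates
# a gap. The relative-performance values here are made up.
baseline = 0.8     # where the truncated axis starts
a, b = 1.00, 1.15  # two cards' relative performance (hypothetical)

true_ratio = b / a                              # what the numbers say
visual_ratio = (b - baseline) / (a - baseline)  # what the bar heights say

print(f"real difference:  {true_ratio:.2f}x")
print(f"bar-height ratio: {visual_ratio:.2f}x")
```

a 15% lead drawn on an axis starting at 0.8 looks like a 75% lead, which is exactly why people called the graph misrepresentative.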

Dual-Link DVI is what you need for 120 or 144Hz, Pajeet.