Wasn't the RX 480 supposed to compete with the GTX 980? How can it when it can't even beat the GTX 970?

2×RX 480 in CrossFire is supposed to compete with the GTX 980. Next time, wait until you are less retarded before making the 50th graphics card thread on Sup Forums.

380 higher than the 380X, nice graph

>that little unicode x
literal AMD pajeet damage control confirmed

Drivers will improve pretty much everything you dumb n/v/idiot

Well maymayed

[spoiler]$0.02 has been deposited to your account[/spoiler]

it beats the GTX 980 in a few titles...
people expect quite a lot from a $200 card

I also dislike the crying about power consumption.
AMD wasn't stepping up from the GTX 970, they were stepping up from the R9 390; that's what should be the reference point for improvement.

In a goyworks game. Might as well throw fallout 4 up there too


AMD already said no new drivers immediately

>spoiler
>le orc man

Did you get lost on your way to Sup Forums?

Ok, wtf are people complaining about?

It's funny how it's only acceptable when nvidia does this.

>Nvidia goyimworks game gets better performance on nvidia card. "AMD SUCKS!"

>AMD dx12meme gets better performance on amd card "DX12 DOESN'T MATTER!"

Nvidiots are in advanced damage control mode until 1060 launches.
We must prevent the 480 from selling!

you might want to study up on what DX12 is and what GameWorks is

Did you guys see this video?
RX480 vs 970 frames

youtube.com/watch?v=ozATbV11rBA

more optimized drivers coming.


the gtx 970 has to be super OCed to compete with the stock RX 480, what a joke

not to forget that gtx 970 has mature drivers

But hey, at least it's got eight jiggerbytes, right goys?!?

Literally only hitman and AOS favor AMD in benchmarks.

TL;DR?

>Literally

Thanks for providing my point.

>m-muh DIRECTX 11 TITLES

nvidiots genocide when?

>Literally

mass suicide from NVdibots today

i have an Nvidia card so i'm not biased

Performance is fine, retarded leaks and bad marketing left many salty as they were expecting more.

All DX12 games (all three of them, hopefully more soon though right?)

>bad marketing
Really? Do you think it harmed AMD?
Lots of units will be sitting in the warehouses not being sold?

>Do you think it harmed AMD?
No. OEMs will probably be wanking all over this card.

>Guru3D on purpose did not bench it against a 390
>in the conclusion they admit it has similar performance and price in a short sentence buried in a wall of text

And you shills say Nvidia pays for their reviews.
Sheeesh.

nvidiots wrecked

2/3

3/3
I am loving these nvidiot tears.

Bonus round

>mature drivers
Something an AMD user will never experience. :^)

So basically the card is shit if I don't run Windows 10?

Do you understand that the 480 will fry your mobo if you buy the reference one now?

Nice meme

DELETE THIS

The fuck are you talking about, shill? Don't you have a housefire(or housefires, in case you were dumb enough to do SLI) to put out?

it is almost as fast as the 390 and faster than a 970, which is pretty good considering the price.

It pulls way too much for a 6pin+pci-e gpu

you forgot the OC review from PCGamesHardware where they showed their 480 OCed to 1.3GHz wrecking shit, and they compared it to a 980 Classified which costs 630€+. btw in Germany the 480 is 219€ and 269€, which is totally ok for me

Absolute FUD. Several GTX 750tis and 960s pulled more power over the slot than the RX 480 does.
Why don't you go wipe off your asshurt fanboy tears and stop posting.

but it goes over the specifications
and it still pulls too much regardless of whether some shittier gpus did too

>REEE!!!! IT ACTUALLY PERFORMS AS WELL AS PEOPLE EXPECTED IT TO!!!
>Oh I know! I'm just going to start acting like people were expecting the reference board to perform on par with the GTX 980!!
Nice attempt at a straw man, but when all of you n/v/idiots try to damage control by posting thread after thread implying that people were expecting the reference boards to perform on par with a 980, it's pretty obvious what you're trying to do.

true that m8, the reference cards are trash; there is a reason the AIB cards have 6+8-pin power connectors. but to be honest, people who buy reference cards, be it amd or nvidia, are retards

>Crossfire 2 480s
>Motherboard explodes from the power draw
>Literal housefire

The GTX 960 has a single 6-pin connector, and pulls 60W more than the RX 480. I suggest you spread your FUD over at Sup Forums where they'll believe anything shills say and buy the novidya card like a good goy.

Pic related, RX 480 absolutely kicking nvidiot's shit in

>marketed as VR card
>don't provide VR sites with review samples

Will the 480 be loud? I'm looking to upgrade (and on my Czech salary it's tough...) but I want a card that has good performance and doesn't sound like a lawnmower. I have a GTX 580 with a stock cooler right now so hopefully it's quieter than this thing...

If I was you I'd get one with a third party cooler.

there was a post on reddit by the amd rep saying it passed PCI-SIG's testing for power compliance (not just amd's in-house testing).
I suspect tom's hardware has a flawed test.

If i wanted 970 performance i would have upgraded 2 years ago. I see literally no reason to upgrade from a GTX 670, since i'm still on 1080p, i'm not a sperg, and i can play medium/high at 60fps in pretty much any title.
Once i get a 4k display with HDR and at least 75Hz, then i'll look into upgrading my card.

Apparently the stock cooler is pretty quiet. I however don't trust single fan solutions so I'd wait for aftermarket coolers or non-reference boards to come out before buying one if I didn't already own a GTX 970.

>14nm node
>comparing to 28nm node that's no longer being made
>uses more power than a 1080 for 80% less performance
>hahaha take that nvidiots!!!!

jesus christ this is just sad

Day one, reference RX 480 slightly better than 970.

In two weeks, AIB models with >15% overclocks will clearly surpass the 980.

In one year, the RX 480 will be on par with the 980 Ti, mark my words.

The RX 480 reference design (single 6-pin) draws more power than recommended from both the 6-pin and the PCIe slot, meaning it can fry your mobo or kill your PSU (though it's unlikely, it can still happen). The card is actually much less efficient than the 1070/1080, so if you are planning to buy it, wait for a custom-PCB version with dual 6-pin or an 8-pin; those will perform and OC much better and give the card the 200-225W it actually needs.
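For reference, the power budget being argued over works out like this; a minimal sketch, assuming the PCI-SIG limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin) and the roughly 165 W gaming draw figure floating around in reviews:

```python
# PCI-SIG board power limits in watts (per the PCIe CEM spec)
SLOT_LIMIT = 75    # PCIe x16 slot
PIN6_LIMIT = 75    # one 6-pin auxiliary connector
PIN8_LIMIT = 150   # one 8-pin auxiliary connector

def board_limit(*connectors):
    """Total allowed board power: slot plus each auxiliary connector."""
    return SLOT_LIMIT + sum(connectors)

rx480_limit = board_limit(PIN6_LIMIT)   # reference RX 480: slot + one 6-pin
rx480_draw = 165                        # approximate gaming draw reported by reviews

print(rx480_limit)               # 150
print(rx480_draw - rx480_limit)  # ~15 W over budget
```

The spec also constrains how the draw is split between slot and connector, so the overdraw matters most when it comes through the slot itself, which is what the measurements under discussion claimed.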

>15% overclock

They don't just stop using a node as soon as a new node becomes available... They're obviously still going to continue making GTX 960s until the 1060 comes out at some point in the autumn.

it passed pci compliance. toms is fucking garbage

try july, and the fact you're comparing 2 years old tech to this garbage is pretty telling

How are the temps on it? Are they housefire tier or decent?

it's not nvidia's fault that shit manufacturers like asus overclock/overvolt their cards and go over the power spec
stop trying, pathetic amdshill

>14nm node
>comparing to 28nm node that's no longer being made
The only thing that matters is price. Absolutely no fucking one gives a shit about which node AMD or Nvidia is using, as long as they get decent performance and power consumption at a low price.
>uses more power than a 1080 for 80% less performance
Please refer to and , shill. You're just embarrassing yourself.

The only one sad is you now that your cover has been blown. No one can be as stupid as you, so the only probability is that you're a nvidiot shill.

The 1060 is the competitor for the 970/980; the 960 might continue being produced long after, rebranded as a 1050/1040 (while the actual Pascal variants will be the 1050 Ti/1040 Ti, released around mid-2017)

yet you keep posting their graphs, so make up your mind: either tom's is garbage or it isn't.

It's not novidya's fault that a _reference_ GTX 960 pulls 80W more than an RX 480 despite both cards having a single 6-pin connector either, is it, shill?

It only pulls dangerous loads under FurMark, so you really shouldn't worry; it's pretty common for cards to pull more than what the spec suggests. The GTX 970, for example, will at peak times pull close to 300W when its PCIe and 8-pin power should limit it to 225W.

So stop acting as if this is some kind of motherboard and PSU killer, as pretty much all GPUs exceed their maximum allowed specs at peak times or under heavy stress.

*60W, before a nvidiot starts hyperventilating and throws an autistic fit.

So are the rumors about it being extremely hot true?

>It's great guyze! You just can't consider this, or that or that one and oh, this other one and yeah, all those other games too!!!!!

Might as well save yourself the trouble and just watch a playthrough or do something else more productive with your time.

Yes, a 2 year old card that's not only being used by plenty of people, but also still being sold new to consumers.

Do you really think that Nvidia is going to let its own products compete with each other like that? When the GTX 1060 comes out they're obviously going to discontinue at least the 970.

>furmark as a bench for power consumption
what a fucking retard

once they get some 3rd party coolers out and get real overclocks, it will be within striking range of the 980

Rx480 is literal shit. Can't beat 970 on Witcher 3 the best game optimized for AMD cards. Truly shameful.

>Do you really think that Nvidia is going to let its own products compete with each other like that? When the GTX 1060 comes out they're obviously going to discontinue at least the 970.
Sure, i meant exactly that: once the 970/980 supply dries up they're gonna release the 1060. The 960 will not directly compete with the 1060, just like the 970 isn't directly competing with the 1070.

that's very clearly an error, gtx 960 has a tdp of 120W

How is FurMark invalid as a power consumption benchmark? Are you cherrypicking or do you have an actual argument? It shows how much a GPU can pull under stress conditions.

nice b8

Sure, in 3 years.

So how big of a loss is AMD taking on each card, selling them for only $200?

Every GPU for the past decade downclocks running furmark

Well, here's the closest I could find to a standalone reference 960 benchmark(this model's power target is 10W higher than reference).

they need to buy like 2 billion dollarinos' worth from GloFo before they can buy their stuff from someone else, and based on how badly GloFo performs i wouldn't be surprised if they aren't making any money at all.

>asus
what a surprise. they're a joke

Did OP suddenly become retarded and could only keep memory for 1 page at a time?

Uh, what? Are you actually retarded or just pretending? This is not a "wait until we get thermal throttling" test. They just fire up FurMark and measure for a small amount of time, probably less than a minute. That won't get you thermal throttling. The reviewers know that a throttled result is not accurate.

some user did the math and because amd´s yields are good because of small die(chip) sizes and a some what ok proccess they make around ~50$ from a card which is really good in that price section.
Nvidia for example has horrible yields on their 1080cards biggest reason here is because the die size of the 1080 is really really big which is the only reason they are faster, nvidia always goes the way of moar die size to beat amd because if you would have to compare them on the same die size, amd would win 10 out of 10 because their architecture is much better, so nvidia is forced to make bigger chips to win which crippels yield and makes the cards so expensive. im fine with that, but at some point the yields of bigger chips get so bad they are forced to make better architectures.

You don't use fucking furmark for measuring efficiency. Wow, a card with a power delivery designed for 150W consumes less than a card with a power delivery designed for more than that in furmark. What a fucking surprise. Fuck off to Sup Forums or something.

tomshardware.com/reviews/amd-radeon-rx-480-polaris-10,4616-9.html

Lol saving this to laugh at all the nvidia shills repeating whatever the team green PR department sent them this generation

This has been debunked: the 1080 sold much more than the 980/980 Ti in the same timeframe. Also, the 1070/1080 die size is 314 mm², while the 980/970 is 398 mm². Meaning they have better yields than the 900 series.

>draws more power than a 970 under a typical gaming load

wew, so much for that efficiency. so it's both slower AND less efficient than a 2 year old gimped architecture from nvidia. amd needs to die and be bought up by samsung already. at least then we'd have some real competition in the market.

sold != delivered; there were a shit ton of people who preordered their 10xx and still didn't have them.
also, you're comparing a fully matured 28nm process to a bad-to-mediocre 16nm process with nearly the same die sizes? are you retarded

Not to mention a 1070 draws less power yet is far more powerful

The 1060 will be an interesting card when it launches

it's over 20% smaller. also you're comparing the new amd process as well... from extremely crappy GloFo, who can't produce for shit, which is one of the main reasons AMD is in a complete shithole (and will stay there, since they have to buy at least 2 billion usd worth from GloFo before they can buy anything else)
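For what it's worth, the "how much smaller" figure depends on which die you use as the base; a quick check with the 398 mm² / 314 mm² numbers quoted upthread:

```python
gm204, gp104 = 398, 314   # die areas in mm^2, as quoted upthread (980/970 vs 1080/1070)

smaller_by = (gm204 - gp104) / gm204 * 100   # GP104 relative to GM204
larger_by = (gm204 - gp104) / gp104 * 100    # GM204 relative to GP104

print(round(smaller_by, 1))   # GP104 is ~21% smaller
print(round(larger_by, 1))    # GM204 is ~27% larger
```

So "more than 25%" only holds in the larger-by direction; the new die is about 21% smaller than the old one.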

I don't see the problem, shilly. Both in FurMark and gaming tests, the order is roughly the same(the most significant change in order is the 970 dropping 2 places, others are either a single shift or no change at all). Most important of all, no cards cross the 480 boundary, so the 480 stays at 3rd place, and the 480 itself only draws 4 watts more under FurMark. Your point is fucking retarded.

How long until the AIB's are due?

>draws more power than a 1070
>smaller die
>is much, much slower

what a steaming pile lol

1070 is like 200% more efficient kek.

>muh amd poo in loo
yeah ok boy... believe whatever floats your boat, i don't care

Disgusting. How did Nvidia become more efficient going from 28nm to 16nm, whereas AMD literally has the same efficiency as a 2 year old architecture, with their new architecture, going from 28nm to 14nm? Am I missing something here?