With the way AMD has been gimping their cards recently, how long until the 480 starts beating the fury x?

Don't know, but I've been noticing this too. The 390's performance has tanked noticeably.

When Vulkan and DX12 are in every game. It's already happened in AAA titles.

Not sure what I should be looking at here.

The 480 was made for the mass market, so they targeted 1080p monitors, and that's where the 480 performs best. Also notice the minimum frame rate; that matters more than the maximum, and it's something the 390 and 390x hold over the 480.

Now looking at the Fury line, they were fucked over hard by the process node; they rely on Vulkan and DX12 to perform as well as they should have.

But looking at this chart, what in the actual fuck is happening at the high end? Pretty much everything from the Fury to the 1070 is at the same fps... on DX11 that shouldn't happen. Let's take AMD out of this: why are 3 different cards almost matching each other on Nvidia's side?

something is going on under the hood in this game

It's true, my 7950 is performing like shit lately but I've heard so many gimping memes I'm afraid of which brand to go to.

It's not so much the framerate, but just the card stuttering every few seconds.

I don't get it. Isn't the 980 Ti/Titan X supposed to perform just barely worse than a 1070?

Fury X is the 980 Ti competitor, so that's kinda to be expected, but I'm curious about frametimes too.

I'm just bummed out about the horrible stuttering that's been happening on my 7950 for the last 2 months. I tried changing thermal paste and increasing core voltage+ power target, but it's still happening...

>cherrypicking benchmarks
>lying on the internet
390 is almost on par with a 980 these days. That's a $200 more expensive GPU.

That's pretty amazing considering it's just a warmed up R9 290 that originally launched in 2013 to compete with the GTX 780

>comparing it to a stock 980

Now aren't you retarded.

The 390 has little to no OCing headroom because, just like you said, it's a warmed-up R9 290, aka overvolted and overclocked almost to its limits, whereas reference Nvidia cards are WAY below their potential.

Take a reference GTX 970 for example: its stock clock is something like 1000MHz core, whereas you'd be hard pressed to find any GTX 970 that easily hits a 1500MHz core clock, which already puts it in stock 980 territory.

The 390 is a stupid gpu.

hard pressed to find any gtx 970 that CANT* hit 1500MHz

I fucked up that line a little.

>2016
>looking at stock benchmarks when 99.99999% of people buy aftermarket

Joker Productions has already settled this, and the 970 won.

>mfw 7870 still over 30 fps

I too can compare factory overclocked cards to reference clocked cards :^)

They delivered double the performance per watt through the power of gimpsies!

...

Not him, but it's unfair to compare 390 to 970 because there's no such thing as a reference 390.

Should be a 290.
And you can oc a 970 more than the 390, which isn't shown in that chart.

Funny how it's got an unfair bias to AMD, huh.

>muh stock vs aftermarket

Fresh from 2016:

computerbase.de/2016-06/nvidia-geforce-gtx-1070-test/4/#diagramm-performancerating-2560-1440

Games, GPUs and their clockspeed used:
computerbase.de/2016-05/grafikkarten-testsystem-2016/2/#abschnitt_die_benutzten_grafikkarten_und_die_taktraten

The only thing that holds up is the superior overclocking.

With a factory overclock
>970 within 1 frame of the 390
>980 outperforming the 390x
>strix 980 actually within 1fps of the fury

Thanks for proving that is an unfair and inaccurate piece of shit.

>970 within 1 frame of the 390
That's an aftermarket 970 barely matching the STOCK 390. Are you retarded?

>980 outperforming the 390x

Now you're just pretending. The stock version is losing

>strix 980 actually within 1fps of the fury

And that's the only point you have.

Holy fuck, Sup Forums is cancer nowadays.

There's no such thing as a stock 390 though.
AMD skipped reference cards entirely because they were rebranded 290s.

Why aren't you comparing the 970 to a 290, or a 980 to a 290x?

Also, a 970 can be overclocked to reach fury performance so there is literally no reason to buy that generation of rebrandeon housefires.

Lol take a look at gtx 680, 660ti and gtx 780

>cuckvidia

>There's no such thing as a stock 390 though.
Read my fucking links or kill yourself right away, you fucking fanboy faggot.

They're using stock speeds in the link I've provided. And before you try to bullshit again, here's a link to AMD's specifications: amd.com/en-us/products/graphics/desktop/r9

>Also, a 970 can be overclocked to reach fury performance

I want to see a source that isn't hardocp with four out of five gameworks titles. And even in that one it doesn't match a fury.

The clocks used are laid out clearly in the link. The AMD 390 is run at 390 stock clocks.
The MSI 390 wasn't even run at its highest factory setting, which is 1060MHz. If they had done that, it would have edged out the stock 980 and probably even the stock 390x. No 970 can compete with that.

Check it. 970 getting within spitting distance of your precious fury x.

m.reddit.com/r/nvidia/comments/4bqpyq/the_division_gtx_970_overclocked_vs_amd_fury_x/

FUN FACT: the MSI 390x is the highest-clocked Hawaii-based card you can buy; it's clocked higher than the 290x Lightning and the 290x Vapor-X. Equally, the MSI 390x has the highest power draw of any single-die Hawaii card due to frankly obscene voltage.

Check this, your 970 getting murdered.

>AMDumdum too stupid to compare 2 cards at stock clocks.
It's like you're mentally incapable of understanding that the 390 is a factory-overclocked 290.

I thought no adult actually overclocked their GPU?

>m.reddit.com/r/nvidia/comments/4bqpyq/the_division_gtx_970_overclocked_vs_amd_fury_x/

Good job, now show me a source that isn't one specific, cherrypicked title.

I can play the same game with pic related.

No adult overclocks AMD GPUs, because they can't afford to burn their house down.

A 970 is using half the power of the 390 so it's fine.

>crysis 3
I too can cherrypick outdated benchmarks.

OBSERVE!
As a 970 outperforms a 290x on mantle!
In an AMD sponsored game!
Check mate atheists.

>reddit
>m.reddit
>youtube as source
>Ubisoft game
>watercooled 970

>It's like you're mentally incapable of understanding that the 390 is a factory-overclocked 290.

The fact that you're grasping at straws this desperately shows me that you're out of arguments.

>A 970 is using half the power of the 390 so it's fine.

You could've provided a source for this, as it isn't even that far off. But again, you're just shitposting.

>I too can Cherry pick outdated benchmarks.

You're the one who fucking posted that Division benchmark.

>No adult overclocks AMD GPUs, because they can't afford to burn their house down.

Neck yourself you fucking fanboy faggot. You're the reason why GPU threads on Sup Forums turn into shit.

>incapable of doing a quick search through any popular search engine
Absolutely pathetic.

If you're trying to prove a point, then link fucking sources.
I didn't even disagree with your statement regarding its power consumption.

But again, you're not in here for a discussion about GPU performance, but for defending your favorite brand in a shitposting manner. I got that.

>itt: retarded amd fanboys

Just watch joker productions 2016 bf1 video. The 970 smokes the sapphire 390 by 7 fps.

t. I suck Nvidia's dick over one benchmark

Kill yourself

>Linking an average of 26 titles across the board benched in 2016 with clock speeds on every card declared
>Somehow two cherrypicked games from some random """tech""" youtubers who don't even declare their setups properly weigh more, and everyone disagreeing is just le AMD fanboy

This is what Sup Forums has become.

Sometimes I wish I could snap necks over the internet to cure the world of autists like you.

>heavily amd optimised game
>stock benches claim 970 loses by 7 fps
>actual video recorded benches of aib cards show 970 wins by 7 fps
>i-It's all a lie and c-conspiracy!

>being this assblasted

right after it beats the 1070

>380 rekts 960
>290 rekts 970
>gimped
Glad I got rid of my 280x though

The 390 was a high-res card. The fuck are you using it at 1080p for?

> (OP)
>It's true, my 7950 is performing like shit lately
>It's not so much the framerate, but just the card stuttering every few seconds

What games? Might need to reinstall Wangblowz or roll back drivers. My RX 470 simply could not play Quantum Break with Vsync on. Horrible stuttering. Had to turn off Vsync and max out graphics settings to keep the framerate below 60.

Joker is literally just that, a joker. Probably the worst performance "analyst" I've ever seen, which is why consumer-whore nvidiots flock to him. The BF1 vid is especially retarded; you can clearly see that the effects settings are different between the runs.

Depends on the card. Also it's so easy, why not?

>buy AMD card
>power bill doubles
>room becomes hot like the desert
>programs regularly just shit themselves because drivers are STILL fucking shit
>can't play like every second AAA game at release because there's major technical problems with AMD cards

>it's okay though because it'll perform 5% better in two years

2009 called and wants its memes back
I like how you scattered a few 2012 memes in there though

And exactly what part of my post isn't true nowadays? Ignoring the obvious hyperbole: AMD cards are currently less efficient than Nvidia's, get hotter, and there have been more games released this year that had problems on AMD hardware at launch but ran fine on Nvidia cards than the other way around. Drivers are debatable, but considering the people who claim the drivers are great now said the same thing eight years ago (when the drivers definitely weren't fucking great), I'm really not willing to give AMD the benefit of the doubt here.

>consumer whore nvidiots flock to him

He's pro-AMD and anti-Nvidia, you spastic. Watch any of his podcasts. He's always shilling AMD, and his current main PC build is an AMD build with an RX 480 and an 8370, which he said he'll upgrade to Zen when it comes out.

I usually just lurk, but I felt compelled to point out how fucking retarded this thread is.
You fucking gayming babies need to get a real hobby and stop flooding this board with shitty Sup Forums memes

Don't forget that they suck in the aftermarket space. These fanboys keep going on about stock benchmarks but don't even use a stock card themselves. Aftermarket benchmarks speak for themselves, and most aftermarket 970s BTFO aftermarket 480s. The 1060 is so far ahead in the aftermarket that it doesn't even compete with the 480 anymore but with the Fury Nano. An aftermarket 780 Ti is faster than a 480 too.

This.

I had to return my 7950 because it was a stuttering mess in BF4 when I bought it. Ended up saving up a bit and getting the 4GB GTX 680 instead, and it's been great.

I think one reason it looks like Kepler hasn't aged well is the VRAM. I haven't seen any modern reviews with the higher-VRAM models of the cards, so the 7950 has a 50% VRAM advantage.

Pretty sure I've seen that listed as a fix in a recent driver release, user. Might want to update. At least check the patch notes.

When are AMD releasing their new GPUs? I read that it would be in December but it's already the end of November and no new AMD GPUs in sight.

Have they fucked up again?

RX480 is a 150W card, GTX 1060 is 120W
So you're just meming about heat/power with the current gen
Drivers are a bit meh, but since the games are all running on AMD consoles there are as many problems with NVIDIA drivers/perf now

I don't care which team he's on. He's clearly a moron and I've only seen nvidiots cite him

Fuck no, retard. Maxwell doesn't scale as well with an OC. If an RX 480 is at 1340MHz (common) and the GTX 1060 is at 2000MHz, the RX 480 will still win in a lot of games. A 15% OC on the 1060 clearly doesn't make up the difference in the OP's bench.
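Putting rough numbers on those clocks, for anyone who wants to check them. The stock boost figures here are my own assumptions from memory, not taken from anything in this thread, so treat it as a sketch:

```python
# Rough clock-gain math for the two cards being argued about.
# Assumed stock boost clocks (NOT from the thread's benches):
#   RX 480  ~1266MHz, common user OC ~1340MHz
#   GTX 1060 ~1709MHz, common user OC ~2000MHz
# Clock gain is only an upper bound on fps gain; real scaling is
# lower, especially in memory-bound titles.

def pct(base_mhz: float, oc_mhz: float) -> float:
    """Percent clock increase from base to overclocked."""
    return (oc_mhz - base_mhz) / base_mhz * 100

rx480_gain = pct(1266, 1340)
gtx1060_gain = pct(1709, 2000)

print(f"RX 480 clock gain:   {rx480_gain:.1f}%")
print(f"GTX 1060 clock gain: {gtx1060_gain:.1f}%")
```

So under those assumed baselines, the 1060's jump to 2GHz is roughly a 17% clock gain versus about 6% on the 480, which is the gap the two posters above are arguing over.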

490 is coming out soonish; it could be a dual-Polaris GPU, which would probably suck for anything but professional applications. If you're waiting, wait for Vega, which will have all the PS4 Pro's tricks.

No. The gap between the 480 and 1060 clocks here is way smaller than the ones you claim won't give the 1060 a win, and the 1060 is still winning by ~7 fps in this heavily AMD-optimized title.

>If an RX480 is at 1340 (common) and the GTX 1060 is at 2000MHz the RX480 will still win

Tippest of koks

>RX480 is a 150W card, GTX 1060 is 120W
And they're about the same performance-wise, right? Which means the RX 480 uses fucking 25% more. You don't think that's actually pretty huge? Never mind the 1070 also having a 150W TDP and that card shitting all over your precious RX 480.

>Don't update drivers
Problem solved. You can all stop posting in this thread. Maybe even go outside instead of jerking it to anime figures.

>but since the games are all running on AMD consoles there are as many problems with NVIDIA drivers/perf now
That didn't seem to help Dishonored 2 or Doom much. (Granted, Dishonored 2 runs like shit on any system; it still runs worse on AMD though.)
Can you even name two AAA games released this year that had major problems on Nvidia cards at launch? For the record, I don't think most of these problems are even AMD's fault; the thing is, I don't really care whose fault it is when I can't play a game at launch, and it always seems to be AMD with the problems.

The newer XFX GTR 480s use less power than a GTX 1060 due to better binning by GloFo. Just so you know.

Trips of truth

On a percentage basis it's notable, but in real terms 30W is nothing, especially since the difference will decrease if both cards are OC'd.
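For anyone who wants to check the math on that 30W: the relative figure and a ballpark annual cost. The electricity price and hours of gaming per day below are my assumptions, not from the thread:

```python
# Sanity-check the 150W vs 120W argument: the relative delta and
# what the absolute 30W gap costs over a year of gaming.
rx480_tdp, gtx1060_tdp = 150, 120  # board TDPs quoted upthread

relative = (rx480_tdp - gtx1060_tdp) / gtx1060_tdp * 100
print(f"RX 480 draws {relative:.0f}% more at TDP")

# Assumptions: 4 hours of gaming per day, $0.13 per kWh.
hours_per_year = 4 * 365
extra_kwh = (rx480_tdp - gtx1060_tdp) / 1000 * hours_per_year
print(f"Extra energy per year: {extra_kwh:.1f} kWh "
      f"(~${extra_kwh * 0.13:.2f})")
```

Under those assumptions the gap works out to roughly 44 kWh a year, i.e. a handful of dollars, which is why "25% more" and "30W is nothing" can both be true.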

Vulkan Doom runs amazingly on AMD cards, and better than OpenGL on NVIDIA cards.

Quantum Break pops into my head, Mafia 3 is another

I'm talking about users OCing the binned chips. RX480 hits about 1340MHz and GTX1060 hits about 2000MHz

Game? Notice that all cards are well over 60FPS, so I really don't give a shit. Use a modern example where extra perf is actually useful next time

Hory sheet! Checkum.

Literally every 1060 OC video I've watched has been able to OC past 2GHz to 2.05 or 2.1. The game is GTA 5, which requires a lot of extra performance, especially in grass or in the city.

>i-It's over 60 so I don't care about being 20 fps behind for the same price

Like clockwork

>QB

>mafia 3
>those amd frametimes

>It's over 60 so I don't care about being 20 fps behind for the same price
Exactly: you only need the extra perf in new games, which are AMD territory. Telling me that NVIDIA performs better in old, easy-to-run-on-any-vendor's-card games is pointless.

Have fun with your 2012 games. GTAV runs better on NVIDIA, everyone knows this

Quit posting cherrypicked screenshots, shithead. Post actual benchmarks with averages and minimums

>Even six months after its Windows 10 store release, our GTX 970 retest on Direct X 12 eventually resulted in a crash caused by an Nvidia driver recovery (notably after 30 minutes of play, where hitching rose in severity over time). Despite numerous driver and game updates since, Quantum Break still freezes for us on our GTX 970 on DX12
eurogamer

>it's another GPU thread
Sage and hide, fuck off back to

Ayyy

>graphics processing units aren't technology

Nvidiots BTFO

Whoops, cropped off the gpu model in that.

>talks about cherrypicking
>cherrypicks day one issues

>day one issues
>six months after release
It's like you can't even read

>day 1 dx11 version

it's like you were born with a single brain cell

Does anyone have proof, or do they just "feel" it's worse? OP's graph shows the 390 where I expect it. More likely the 480 drivers have improved.

>day one DX11
>six months after the DX12 version
It's like you really haven't thought this through

>retest

can't tell if you're legit retarded or just trolling

my Nvidia® GeForce™ GTX 960 still barely hits 20% usage in titles like Owlboy at 720p, so I think I'm good to go.

390x here, performance has gotten noticeably worse since I first got the card back in May.

going to get a 490x when it comes out anyway

That's your computer in general, probably bogged down with crapware and thermally throttling because you can't be bothered to dust it.

I'm legit not even sure what you're talking about. Read the chain and see if you can post a coherent reply.

>gimping
The older cards perform slightly better than a year ago when compared to nvidia cards.
AMD just optimized the drivers for the RX480, as they were rather shit on release. (pretty common for AMD)

He hasn't because he can't

>7970 56fps
>680 44 fps

retard detected.

is it normal that every amd owner is this mentally challenged?

>inb4 b-but nvidia

So, I don't like upgrading. 480 or 1060? Which will last me longer?

>a;fjneroerggpsdcw
NVIDIA users everyone

Any AIB 1060. Just look at how well the AIB 780 Tis are holding up: faster than both a 1060 and a 480, and almost Fury levels of performance.

source: duderandom84

Look at the OPs bench and answer the question for yourself

Nope, it's fine, it's not CPU-bound at all.

>Vega, which will have all the PS4 Pro's tricks

Not true. Vega has 2x FP16 and the hardware work schedulers enabled, but won't have the hardware object/poly ID buffer tagging for cheaper/faster postprocessing.

Just bought an MSI factory oc'd 470 for $155, with a $20 rebate, and a copy of Hitman, which I'll flip on ebay for $25.

So essentially a $130 card that can OC up to 480 levels and max out every modern game at 1080p. Y-you mad??/

It was $174**** so $130 after rebate & losing the shitty game. Lellers

Newegg had the Asus for $145 after rebate, and it came with Hitman.

All of this checks as the average Nvidia cuckboi

>fucking youtubers as a source
I want Sup Forums to leave

Yes, but MSI is best for overclocking.

I'm assuming this is all stock.

The 1070 is above a stock Titan X, which is above a stock 980 Ti by a sizeable margin.

A normal OC'd 980 Ti beats out a 1070; even if the 1070 is OC'd, it doesn't scale into frames and isn't OCable by as large a margin as a 980 Ti is.

The Titan X is under the 1070 because of clocks, but if you OC it, it sits above the 1070 by a sizeable amount even when the 1070 is OC'd; even a 1080 is hard pressed to beat an OC'd Titan X.

It's weird to me how everything at stock is basically the exact same effective frame rate. There's no curve, no matter how slight; everything is too close for margin of error to even come into play.

480 is already beating the 1060 in most stuff but why would you get a mid end card for longevity? Just wait for the 490 or get a 1070/1080

>tfw 760 still is alright too

I love how the Nvidiots are trying to push this narrative since they've been fucked so hard with Kepler.

I swear this is just like what they did with the 290x because of Fermi even though it was the reference cooler and not Hawaii itself that was a housefire.

They're just nervous because the industry is finally starting to trend away from GoyWorks baked in DX11 advantages, not to mention the fact that PS4 and XBone refreshes basically ensure another few years of devs getting comfier with (admittedly weak) AMD hardware on consoles.

If you thought Kepler had poor longevity, Pascal is going to put it to shame.