Stuck with an aging Fury X that can barely keep up with modern 2017 games

>stuck with an aging Fury X that can barely keep up with modern 2017 games
>see amazing 1080TI benchmarks
>cry and continue waiting for VEGA, which might be able to keep up with a 980 TI this time

Is buyer's remorse a common feeling with AMD products?

Other urls found in this thread:

youtube.com/watch?v=_ugW_iwVfZo

>fury
>aging
>4K meme
baka lad

As far as I know, the Fury X is still as powerful as a 1080.
Are you one of those who think running games under 100fps is bad?

>As far as I know, the Fury X is still as powerful as a 1080.
Are you fucking kidding me? It's 1070 level in the best-case scenario.

Fine, but you still have a high-performance GPU.
I would wait for nvidia/amd next gen.

I literally haven't run into anything I can't fully max out at 1440p at less than like 80fps on my Crossfire 290Xs

AYYMD REBRANDEON HOUSEFIRES

>Waiting over 1 year for KEKGA

Why would you wait for next gen?

There are plenty of games that don't work with CF

You're acting as if you'd feel any different if you had a Titan X instead of a Fury X; the GTX 1xxx series is just that much better.

wtf am I reading.jpg

Some games still do more than fine with a single 290X, let alone Crossfire. Heck, if you're at 1440p/60Hz and don't mind using High settings instead of Ultra, an RX 480 8GB or a GTX 1060 6GB might be enough.

Sure, for high settings it might still do the job.

finally amd fags are waking up and smelling the coffee

You know, the Titan X, the direct competition of the Fury X at release? Before the GTX 980 Ti came out?

Better GPU purchase timing.
Would last longer.

>DUBS DO NOT LIE

That's awfully amusing.
As my 7950 still runs new games on high at 40+fps.

That argument can always be applied, which would lead to an indefinite postponement of purchase.

It's retarded, there will always be new technology around the corner.

Actually the 980 Ti was already available at the time the Fury X was released; it was its direct competitor

an early new gen usually ages a lot better than a late old gen as the new gen gets all the optimization focus.

Welp, I have a Nano that I got for peanuts in a clearance sale a year ago and it's still holding up fine. Give it enough cooling in a well-ventilated case and it will perform almost at Fury non-X levels.

It drives everything I throw at it at 1440p at more than 60-70 fps, which is really nice with a FreeSync display, and the minimum framerates improved a lot since I paired it with an R5 1600. It will literally take ages before I retire this little buddy; your Fury and Fury X will still be fine for a long time.

In b4
>muh 8GB of VRAM

In most games, even the most VRAM munching ones, crazy high resolutions become a problem way before the VRAM does.

On the other hand, depreciation costs on new technology are always huge.

For example, I picked up a 1080 for 450 a few weeks ago; the guy had bought it for 830 half a year earlier.
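To put rough numbers on that, here's a quick back-of-the-envelope in Python, using only the figures from the post above; purely illustrative:

# Depreciation on the 1080 example above: bought new for 830,
# resold for 450 roughly half a year later.
new_price, used_price, months = 830, 450, 6
loss = new_price - used_price
print(f"absolute loss:  {loss}")                  # 380
print(f"relative loss:  {loss / new_price:.0%}")  # ~46%
print(f"loss per month: {loss / months:.0f}")     # ~63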

>there will always be new technology around the corner.
That's exactly why, Q3 is right there.

>Actually the 980 Ti was already available
Fuck, you're right, I misremembered that.

>It drives everything I throw at it at 1440p at more than 60-70 fps
You must be throwing old shit at it

You know AMD GPUs get driver optimizations for years, unlike Nvidia's, right?

If you have numbers to back that story up, it might hold.

Else you are just talking in memes

I don't mind scaling down one or two graphical features because I'm not anal about muh ultra settings. I've always had poorfag GPUs so I'm used to it, and in the end I got exactly what I paid for so ¯\_(ツ)_/¯

Just imagine how bad you'd feel if you had bought 980ti instead.

Well, I can appreciate someone who buys last year's newest tech. It offers great value.

I wouldn't overstate its potential though; it's still a fine card if you're willing to tune down some stuff.

Turn down tessellation to 16x or 8x in the driver settings and it's like you got a brand new arch in performance ;)

I'm just waiting for Nvidia to add that slider as well, otherwise I'm not moving from AMD

I'm sure he would have felt better

It all depends on the games you play; sure, Tomb Raider and Project CARS hate the Fury X, but others don't care much.

Flagship models always come at an unreasonable premium which quickly depreciates. That's why I don't buy flagships.

Witcher 3 and Assassin's Creed also run significantly higher on the Titan X

Thanks for the tip. I had it at "Radeon optimized" settings, wasn't that limited to 16x or something like that?

It's not just flagship models; in absolute numbers they have the highest depreciation, but all new cards have high relative depreciation.

970s and 980s can also now be picked up for like 35% of their original price

I think it depends on the game, but I'd just set it to 16x or 8x depending on how much I'm willing to go down in settings.

pic of card or nvidiot shill.

Fine.
My picture is from back then, when the 970 had more performance than the 390; the 10 series wasn't launched at the time.

970 and 390 now:
youtube.com/watch?v=_ugW_iwVfZo

Those cards are still neck and neck.

The thing is that AMD often launches with bad drivers (definitely in the past, it has gotten better) so there is more room for improvement.

Anandtech bench doesn't retest old GPUs from what I know

the 290/390 was initially positioned against the 780 though

Those are 2016 benchmarks

The 290, yes; the 390, no. It was positioned against the 970, and the 390X against the 980.

970 is 780 tho.

Please stop

What? Lol no

It's a defective hardware version of the 980.
I feel sorry for whoever bought that piece of shit.

Just like the 390 is a defective version of the 390X; that is standard procedure for GPU manufacturing on a wafer

When did you get the Fury X?

Unless the 8GB comes in handy, there is no performance difference between the 290 and 390

I'm talking about how they fucked the memory bus in that process.

Cutting part of your memory interface off without telling anyone isn't, however.

There is; it's caused by higher clock frequencies, and the 8GB does come in handy in new titles.
You mean the VRAM

The 390 has high enough clocks to be closer to the 290X in performance.

The custom models of the 290 clock just as high as the 390 though. People just remember the abomination that was the reference leafblower. Hawaii and Grenada are the same chip, not even a respin if I remember correctly. And the 28nm node was mature at that point. There are no significant differences between them apart from VRAM and name.

This. I feel bad for the people who bought it.

No, the VRAM bus; the VRAM itself is fine.

Last card I bought was an HD7850. Still using it so no, can't say that I share the feeling.

Just wow.

there was like a

the shitstorm and the buyers remorse fags defending fraud and jewish business practises when this shit surfaced was glorius though

The VRAM is 3.5GB instead of the advertised 4GB; I wouldn't say that's fine

But the reason for that is the gimped memory interface, not the VRAM itself. That's his point.

>970 below the 960
Yeah, let's call this benchmark what it is: an abnormality

abnormal things tend to happen a lot more often to hardware with abnormal features

You guys won't believe the number of 970s being sold on sites like eBay right now.

And that's wrong; 0.5GB of the VRAM is slow VRAM, it isn't fine

To be fair, that's a worst-case scenario for the 970. RE wants to use as much VRAM as possible and even true 4GB cards are punished heavily at 1440p. Just look at the difference between the two RX 480 models; it's 22 fps even though they run at the same clock speeds.

>>buyer's rem

> buy RX 470 8GB and a cheap 140mm case fan
> turns out casefan is out of stock
> they don't ship it for weeks
> mail them and ask them to ship the GPU and the fan later
> finally get the GPU
> one week later AMD announces 5XX series
> RX 570 will be cheaper they say
> get buyer's remorse

but now that I see the RX 570 4GB costs more than I paid for the RX 470 8GB, regardless of the supposed "reference price" from AMD... not so much. My lower-clocked, less-power-hungry "RX 580" wasn't all that bad of a deal.

>would wait for nvidia/amd next gen.
Waiting for the "next gen" is always pointless. Time will keep moving forward in the future too. There will always be a "next gen" to wait for.

>As my 7950 still runs new games
I'm guessing you'll start running into VRAM limitations soon enough, especially if you try using that thing for any advanced OpenCL. Or not, I don't know what you use yours for. This was the sole reason I upgraded from my 7850: it's got 2GB of VRAM. That's it. I'd probably still do fine with a 7850 with 8GB.
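If you want to check what your card actually reports before you hit that wall, a quick query like this works; a minimal sketch, assuming pyopencl is installed and your driver exposes OpenCL. It lists each device's total global memory and the largest single allocation it will accept:

# List every OpenCL device and its memory limits.
# Requires the pyopencl package and a working OpenCL driver.
import pyopencl as cl

for platform in cl.get_platforms():
    for dev in platform.get_devices():
        print(dev.name)
        print(f"  global memory:    {dev.global_mem_size / 2**30:.1f} GiB")
        print(f"  max single alloc: {dev.max_mem_alloc_size / 2**30:.1f} GiB")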

>AMD: users forever waiting for an update, crying
>Nvidia: users having their 5 month old $500 cards declared legacy, crying

Oh the joys of not being a gaymur retard.

Except it doesn't happen often; that result is just an outlier that is in no way representative of the general performance of either card.

...

...

The VRAM itself is fine. The memory bus is fucked, which is the reason why the last 512MB of VRAM is "slow".

Yeah, what did they do? Glue one piece of DDR1 onto it or what? The VRAM itself is fine; they use eight of the same 512MB Samsung chips. The problem is the disabled L2$ block, which means the card can either read/write the seven chips normally connected to the crossbar at 7/8ths of the advertised speed, or the one chip with the gimped connection at 1/8th of the advertised speed, but not both at the same time.
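For what it's worth, here's a rough Python/pyopencl sketch of the kind of probe people used to show this back then (my own approximation, not the original benchmark): commit VRAM in 256 MiB chunks, then time a device-to-device copy out of each chunk. On a 970 the last chunks should come back noticeably slower, while a normal card stays roughly flat. Where the driver physically places each buffer is up to it, so treat the numbers as indicative only.

# Probe VRAM in 256 MiB chunks and time a device-to-device copy out of each one.
# Assumes pyopencl and numpy are installed and an OpenCL driver is present.
import time
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

CHUNK = 256 * 1024 * 1024                                 # 256 MiB per allocation
host = np.zeros(CHUNK, dtype=np.uint8)
scratch = cl.Buffer(ctx, cl.mem_flags.READ_WRITE, CHUNK)  # copy target, allocated first

chunks = []
for i in range(15):                                       # up to ~3.75 GiB on a "4 GB" card
    try:
        buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE, CHUNK)
        cl.enqueue_copy(queue, buf, host)                 # touch it so real VRAM gets committed
        queue.finish()
    except cl.Error:
        break                                             # out of memory, stop probing
    start = time.perf_counter()
    cl.enqueue_copy(queue, scratch, buf)                  # device-to-device read of this chunk
    queue.finish()
    elapsed = time.perf_counter() - start
    print(f"chunk {i}: {CHUNK / elapsed / 1e9:.1f} GB/s effective copy rate")
    chunks.append(buf)                                    # keep references so nothing gets freed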

Well, I guess I was wrong then

It happens all the time; they found cases back when this surfaced, and now that they've stopped optimizing for it since they don't sell it anymore, it only gets worse.

I have only seen one graph, so I'm not convinced it happens all the time.

Especially since I haven't seen it happen in newer games where the 970 was also tested

say hello to 2015
pcgameshardware.de/screenshots/original/2015/02/Frametimes_1080p_GTX_970-pcgh.png

They eventually fixed it for that game with a driver update. But the fact that that was necessary shows the main problem of the 970: it needs individual optimization to prevent important stuff from being shoved into the slow 0.5GB segment, or it gets fucked.

>pcgameshardware.de/screenshots/original/2015/02/Frametimes_1080p_GTX_970-pcgh.png

That's still only 2 cases.

However, I will be the first to admit the stunt they pulled with the 970 was retarded.

They still got away with it mostly because Maxwell was such a fantastic architecture. Way ahead of the competition in terms of efficiency. And most games at that time didn't exceed the VRAM limitation. Nowadays it's different of course

>AYYMD REBRANDEON HOUSEFIRES
You sure got that right.

I am still using an HD 6970, and only to play CS 1.6.

The very post you're pointing to shows it getting over 60fps at 1440p in all those games except for two. And those two are close enough that they'd probably do 60 at "high" settings.

You're retarded and can't see the difference between 2560x1440 and 3840x2160.

>still
You realize those numbers are from day 1 tests and they aren't retested?
Oh, of course you don't realize that, because you're fucking retarded.

Here's an actual new test.
I hate you retards that have no idea what you're even posting but you post it anyway.

Man I'm running a 280x and 2500k and I'm still averaging 60fps on shit from 2016.

How the fuck are people lacking performance?

No idea. Got a Xeon E3-1230 v3 and an R9 290 and it doesn't look like I'll have to upgrade any time soon

Yep I'm on a 7970 and 2500k as well.
It's my CPU holding me back more than the GPU on newer games. I can turn some graphics settings down if needed to run anything fine on the graphics side, but can't stop the CPU from causing hitches.

290x and 5820k here, managing 4k just fine

would still be on 2500k if it weren't for the mobo crapping out on me but so far it hasn't made that much of a difference

>stuck with an aging Fury X that can barely keep up with modern 2017 games
>can barely keep up with modern 2017 games
>can barely keep up
>with modern 2017 games
now that's how i know you don't own one, my 290x does 60fps 1440p max settings minus some aa in any game i played from 2016/2017, literally no reason to upgrade, fuck off pajeet shill

It won't at stock clocks.

>L2$
is it normal to say that these days?

If you actually look at gaming benches the 290 and 390 are always within 2 fps of each other and the same goes for 290x and 390x.

$ is often used instead of cache

This is how I know you are full of shit, pajeet

I'm still using an r9 290 from 2013 at 1440p and playing games at good framerates

what is even the point of better graphics cards?

I have a fucking R9 290 and an FX 8350 and I get 60 FUCKING SILKY SMOOTH FPS in DOOM 2016, which is in my opinion the best-looking game ever.

I'm still getting Vega but it seems quite pointless

Vega won't be any worse than a 1080ti.

If I didn't have a FreeSync monitor I would have upgraded my 390X to a 1080 Ti by now.

It will probably be released in a month anyways

If you get that 8350 to 4.7GHz and crank DOOM to Ultra (but not Nightmare) graphics settings at 1080p, you won't ever see 60fps; the game will run at around 100fps for a good chunk of the game, dropping to the high 70s in intensive scenes when using Vulkan.