A graphic card that is $200+ more expensive (1070) gets like 10 or even less fps increase over a cheaper one (Radeon...

>A graphic card that is $200+ more expensive (1070) gets like 10 or even less fps increase over a cheaper one (Radeon 480)

How is this OK? I've been out of the loop with graphics cards for a while and I come back to this shit, what the fuck.

>GTX 970

The gimping is real

>A card at the price of the 970 gets 40+ fps

State of this industry

I'm willing to bet it's a memory thing, because of the 3.5 meme and running the game at 1440p. I highly doubt the game would perform better under older drivers.

How the fuck is 970 lower than 960?

>Consoles have a lot of fucking VRAM
>all games are optimized for consoles
>970 has less VRAM than 960
>conclusion, the 970 is fucked

Consoles don't have VRAM, they have 8 GB of shared memory.
4 GB vs 3.5 GB shouldn't make that much difference.

>4 GB vs 3.5 GB shouldn't make that much difference.
It does when the 970 has 3.5 GB of normal VRAM and 0.5 GB of slow VRAM that causes games to slow down whenever it's used. If it only had 3.5 GB of normal VRAM and no other VRAM, most games would just load fewer assets into VRAM and the framerate wouldn't suffer.
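The hit can be sketched with a toy model. The ~196 GB/s and ~28 GB/s segment speeds below are commonly cited figures for the 970, assumed here rather than taken from this thread:

```python
# Toy model of the GTX 970's segmented VRAM: average bandwidth
# collapses once the working set spills into the slow 0.5 GB.
# Segment speeds are rough public figures (assumed), not measurements.
FAST_GB, FAST_BW = 3.5, 196.0   # size in GB, bandwidth in GB/s
SLOW_GB, SLOW_BW = 0.5, 28.0

def effective_bandwidth(working_set_gb: float) -> float:
    """Average GB/s to stream the working set once across both segments."""
    fast = min(working_set_gb, FAST_GB)
    slow = max(0.0, working_set_gb - FAST_GB)
    # time = size of each portion / that segment's bandwidth
    time = fast / FAST_BW + slow / SLOW_BW
    return working_set_gb / time

print(round(effective_bandwidth(3.5)))  # 196 -> all in the fast segment
print(round(effective_bandwidth(4.0)))  # 112 -> touching the slow 0.5 GB nearly halves it
```

So even a small spill into the slow segment drags the average way down, which matches games stuttering only once they actually use that last 0.5 GB.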

>r9 390 beating the rx 480

lol wut

What's surprising? It is technically more powerful on paper.

I thought nvidia patched it to not use the slow .5 GB ages ago.

>Game has fixed internal presets optimized for 2GB, 4GB, 8GB of memory
>Tries to cram 4GB of textures into a 3.5GB card
>lol 8 fps

>tfw still rocking R9 290

>gtx 970 owners
hawaii strong

Nope. They were managing WHAT goes into the slower portion via drivers, but as you know, Nvidia ditches its old cards immediately once new ones arrive, so now we're seeing what happens when games try to access that 0.5 GB as if it were real VRAM.

They'd literally have been better off just disabling it via drivers, because it ruins performance and there's nothing you can do to stop it.

RX 480 seems like the best in terms of price/performance here. Any other recent games with benches like this?

>tfw my Fury gets >100 fps with nearly all settings maxed besides shadows turned to high
>more than enough rendering power
>shits the bed when entering a new cell because of high vram usage

This is suffering. The game will completely hang up when rounding corners and entering doors. I have to turn textures down to medium to prevent tapping out the memory capacity.

This game uses over 6gb of vram at 1080p with textures set to "very high."

All cards with 4gb or less vram are experiencing massive stuttering because of this.

Hawaii has basically twice the hardware of Polaris - the fact that the 480 is so close shows how much more efficient Polaris is than Hawaii.

Hawaii is the godmachine of gpus.

*whilst using twice the power

I love my 290X, but I'd swap it for even a 4GB 480 in a heartbeat, even if I'd lose a little performance on average.

>tfw 390X still pulling in the numbers.

I love this goddamn card.

>twice the power

NOT IN MY SYSTEM

(its closer to 3 times)

>tfw bought an r9 fury less than a year ago instead of a 390

What a fool I've been

Memory bandwidth.
390 has a 384 bit bus, 480 has 256 bit.
So the 390 will win in games where there is a lot of texture loading/streaming going on.

I just leave mine at stock voltage and 1050/1400 these days. Still performs like a champ and is much cooler and quieter than if I crank it.

>390 has a 384 bit bus

It actually has a mammoth 512-bit bus. A true brute force approach.

Whoops, yeah you're right.

What a beast.
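For reference, peak memory bandwidth falls straight out of bus width times data rate. The per-pin GDDR5 rates below are the usual reference specs for these cards (assumed, not stated in the thread):

```python
# Peak memory bandwidth = (bus width in bits / 8 bits per byte) * effective
# per-pin data rate in Gbps. Data rates are assumed reference specs.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(512, 6.0))  # R9 390 (512-bit, 6 Gbps GDDR5) -> 384.0 GB/s
print(bandwidth_gbs(256, 8.0))  # RX 480 8GB (256-bit, 8 Gbps)   -> 256.0 GB/s
```

That 384 vs 256 GB/s gap is why the 390's brute-force bus can pull ahead in texture-streaming-heavy scenes despite Polaris being the newer architecture.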

Because it's a shitty benchmark. I have a 970 and I played through this heap of trash at 60 fps on 1080p but suddenly 1440p is gonna make it go between 8 and 25?

Besides, what people forget is that these shitty benchmarks always use reference GPUs. I hope people realize that a stock 970 is sitting at a meager 1000 MHz, whereas you'll be hard-pressed to find one that can't easily go above 1500 MHz. Meanwhile, the 480 that OP is shilling so much is already near its limits and can't OC worth a shit. Just check an OCed 480 vs an OCed 970. The 970 blows it out of the water.

You are at a higher resolution on a GPU that is not significantly more powerful than a 480.

Here, I know this isn't much, but it's the theoretical power of the cards.

stock 480: 5161 GFLOPS (boost: 5834)

stock 1070: 5783 GFLOPS (boost: 6463)

The XFX GTR comes in at 6165, and if you can clock it to 1475 (it's a binned chip, so you can push it fairly high), 6796.

Granted, most 1070s should hit 1900, so their GFLOPS is 7296.

The fact of the matter is, there was never this massive a difference in raw power between the 1070 and the 480, just drivers, which are getting substantially better all the time.
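Those GFLOPS figures follow from the standard formula: 2 ops per shader per clock (one fused multiply-add). The shader counts are the cards' reference specs; the clocks are assumed values that back out the numbers above:

```python
# Single-precision GFLOPS = 2 ops (FMA) * shader count * clock in GHz.
# Shader counts are reference specs; clocks are assumed to match the
# thread's figures, not measured.
def gflops(shaders: int, mhz: float) -> int:
    return round(2 * shaders * mhz / 1000)

print(gflops(2304, 1120))  # stock RX 480 base     -> 5161
print(gflops(2304, 1266))  # stock RX 480 boost    -> 5834
print(gflops(1920, 1506))  # stock GTX 1070 base   -> 5783
print(gflops(1920, 1900))  # GTX 1070 OC'd to 1900 -> 7296
```

Which is the whole point: at matched clocks the two chips are within about 10% of each other on paper, so benchmark gaps this size come from memory or drivers, not shader throughput.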

It's not that the gimping is real; this is one of those games that demands a fuckload of VRAM. I believe even at 1080p the game can demand 7 GB, and on the 970, 0.5 GB of that 4 GB is shit and will get used. It's literally the only sub-4 GB card on the list; note that the 960 has 4 actual GB, not a gimped 0.5.

You cannot patch a hardware fault. At best you could disable it, but that would be worse for them legally.

The PS4 or the One, I forget which, has 8 GB of GDDR5. Granted it's shared memory, but it's still GDDR5 they're sharing.

The 390's core is 2560:160:64, giving it 5367 GFLOPS. The stock 480 is 2304:144:32; its FLOPS are above.

The 480 got fucked two ways here. One, higher resolutions respond to ROPs, and the 480 is made for 1080p. Two, this is likely stock, so they never undervolted the card, and boost isn't maintained very well that way.

Give me an 8 GB one and I would swap a 290X out for it. Headroom is king in my book.

Just picked up a pair of 4 GB RX 480s for $175 each.

Would have rather gone with a GTX 1060, but apparently there's a bug in the Nvidia drivers with regard to VM pass-through, and I have to use ESXi or Hyper-V because work.

We'll see if the 480 does any better when they get here.

>1220MHz Core Clock on Hawaii
Tell me your secrets senpai!

>buying something other than a 1080

Seems my gpu is extremely leaky - it needs 1.4v to hit those clocks but it will run. Naturally I have equally monstrous cooling (air no less) to actually handle that.

But user, the 1080 is literally the worst perf/$ right now.