WHICH/ONE

WHICH/ONE
WHICH/ONE
WHICH/ONE

Other urls found in this thread:

extremetech.com/gaming/231291-gigabytes-new-mini-itx-gtx-1070-could-kill-off-the-radeon-nano
anandtech.com/show/10465/amd-releases-statement-on-radeon-rx-480-power-consumption

Radeon

Nvidia and Intel's old fat men are right behind your door; you won't be so smug for long, little girl.

Wait for the benchmarks faggot

Amada pleases Nvidia employees so AMD can get R&D funds!

Why? The 1060 is going to win and everyone knows it.
It's just a matter of whether OP is an impatient faggot or not.

>192 bit bus
>driver support for a year only

Even if the GTX 1060 sounds good on paper right now, the RX 480 will be the better card in the long term. If you don't plan to upgrade often, you get cucked by Nvidia, since they abandon old video cards, whereas AMD keeps supporting older cards with their drivers.

GTX 1060

AYYMD cards don't even support Feature Level 12_1, so they're not DX12 GPUs.

bullshit: the post

I wish Nvidia cards had ATI's simple looks.

It's true, though: compare benchmarks from the past 3-4 years and you'll see that a lot of Nvidia's cards are now slower in games than AMD cards from the same generation. Nvidia optimizes their shit for a while, but as soon as some new architecture comes about they don't bother with the old cards.

Reference coolers.

Jesus Christ, when will you fags learn?

And isn't the reference 1060 close to $350?

>midrange shit

The 1060 looks more promising than the 480.

Can you wait a week or so for reviews on the 1060 to make a more educated decision?

AMD does that too. They just haven't had a new architecture for years.

>192 bit bus
Why do people keep bringing this up? Is 8Gbps on a 256-bit bus somehow superior to 8Gbps on a 192-bit bus?

The one with the widest bus, obviously.

I wonder if the reason AMD bothers with old cards is that their last generation is full of rebrands of old cards...

extremetech.com/gaming/231291-gigabytes-new-mini-itx-gtx-1070-could-kill-off-the-radeon-nano

Neither.

Mini-ITX 1070.

THE LIFE OF 480
THE LIFE OF 480
THE LIFE OF 480
THE LIFE OF 480
THE LIFE OF 480

RX 480

...

C U T E

Fuck yes. I'd still buy the Nano on fire sale tho.

All I can see is the shroud on that cooler struggling to keep the card within a two-slot width.

Fat.

RX 480 for OpenGL-based rendering and compute stuff.

GTX 1060 for games and CUDA-based rendering and compute stuff.

So probably GTX 1060 for most people. The 4GB 480 will have better perf/$ but much worse efficiency/thermals/overclock potential.
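
To put numbers on "perf/$": a quick toy sketch in Python, using the announced MSRPs ($199 for the 4GB 480, $249 for the 1060) and made-up relative performance figures as placeholders until benchmarks land:

# toy perf-per-dollar sketch; the relative_perf numbers are hypothetical
# placeholders, NOT benchmark results
cards = {
    "RX 480 4GB": {"relative_perf": 100, "price_usd": 199},
    "GTX 1060":   {"relative_perf": 112, "price_usd": 249},
}
for name, c in cards.items():
    # perf per dollar = relative performance / price
    print(name, round(c["relative_perf"] / c["price_usd"], 3))

Swap in real numbers once reviews are out; the point is just that a lower price can win perf/$ even at lower absolute performance.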

>Is 8Gbps on a 256-bit bus somehow superior to 8Gbps on a 192-bit bus?

Not in actual benchmarks, but 256 is a bigger number than 192 so it helps assuage AMD buyers' remorse.

I FEEL LIKE RAJA

Kind of like AMD's "unprecedented" 8Gbps on the 480, which is the same as the 8Gbps on the 1070?

AMD

Yes. 8Gbps is the per-pin RAM speed. More channels = more aggregate bandwidth. 192-bit is three 64-bit channels; 256-bit is four 64-bit channels.
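
If the arithmetic helps, here it is as a tiny Python sketch, assuming GDDR5 at 8Gbps per pin on both cards:

# aggregate bandwidth = bus width (bits) * per-pin rate (Gbps) / 8 bits per byte
def bandwidth_gb_per_s(bus_width_bits, per_pin_gbps=8):
    return bus_width_bits * per_pin_gbps / 8

print(bandwidth_gb_per_s(192))  # 192.0 GB/s on a 192-bit bus
print(bandwidth_gb_per_s(256))  # 256.0 GB/s on a 256-bit bus

Same per-pin speed, but the wider bus moves ~33% more data per second.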

"unprecedented" meant more like 8GB for $199

Can confirm
my GTX 560 Ti
still gets updated

Gotta grasp at straws. I think AMD might be able to make a better card if they used a narrower memory controller (MC): that saves a lot of power. For a card at the 480's level, 6GB of memory is good enough, and dropping 2x 8Gbit chips and their MCs is a pretty good deal for power consumption.
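
Rough numbers for that tradeoff, assuming the usual 32-bit, 8Gbit (1GB) GDDR5 chips at 8Gbps per pin:

# chips, capacity and bandwidth per bus width, assuming 32-bit 8Gbit (1GB)
# GDDR5 chips at 8Gbps per pin
for bus_bits in (256, 192):
    chips = bus_bits // 32  # one chip per 32-bit channel
    print(f"{bus_bits}-bit: {chips} chips, {chips}GB, {bus_bits * 8 // 8}GB/s")
# 256-bit: 8 chips, 8GB, 256GB/s
# 192-bit: 6 chips, 6GB, 192GB/s

So going from 256-bit to 192-bit drops exactly the two chips (and their controllers) mentioned above.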

I'll be picking up a 1060, personally. The lack of proprietary AMD drivers for Linux, and the open-source ones being FPS-locked to the monitor's refresh rate, doesn't fit well with my needs.

Would upgrading from a GTX 580 to either one make a big difference in practice?

>"unprecedented" meant more like 8GB for $199
No. It was precisely about the 8Gbps memory speed.

>As you know, we continuously tune our GPUs in order to maximize their performance within their given power envelopes and the speed of the memory interface, which in this case is an unprecedented 8Gbps for GDDR5.
anandtech.com/show/10465/amd-releases-statement-on-radeon-rx-480-power-consumption

Because each *chip* runs at 8Gbps (that figure is per pin).

The bus width dictates how many of those chips can communicate with the GPU at *once*.

For some games/scenarios, it can make a difference.
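
A hand-wavy way to see when it matters: if a frame needs some fixed amount of data streamed from VRAM (the 300MB below is a made-up figure), the wider bus just finishes the transfer sooner. The gap only shows up when a scene is actually bandwidth-bound:

# time to move a hypothetical 300MB of per-frame data at 8Gbps per pin
bytes_needed = 300e6
for bus_bits in (192, 256):
    bw_bytes_per_s = bus_bits * 8 / 8 * 1e9  # bus width * 8Gbps / 8, in bytes/s
    print(f"{bus_bits}-bit: {bytes_needed / bw_bytes_per_s * 1000:.2f} ms")
# 192-bit: 1.56 ms per frame of transfers; 256-bit: 1.17 ms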