Explain to my peasant brain why the R9 390 and the Fury had such low clocks. Was the memory supposed to save the entire card?

Because the architecture already ran hot and drew a lot of power, so higher clock speeds weren't realistic.

I know. I used to have an XFX R9 390; sadly it died before I could sell it to miners for a shit ton of money.

If you were to buy a board with a super beefy VRM and watercool it, how much could you push it?

The old architecture isn't optimised for today's high clocks. Hell, if you really look at things, it wasn't until Maxwell that the average card could even get near 1.5GHz.

Not much at all. GCN was not made with high clocks in mind.

GCN was the first to 1GHz. What happened?

The low stock clock was kind of nice. If you could keep the card cool, you could overclock the shit out of it.

It's because AMD GPUs are made out of 20% poo.

Because shit hits 90°C even at 1200MHz, not to mention the power draw.

> low clocks
GPU performance is roughly clock × cores. There are other factors, of course, but when we talk about "clocks" we're talking about the ALUs. So they could either raise the clocks or add more cores; the latter is easier when a new silicon process is available. Besides, AMD's ALUs performed much better than Nvidia's until Pascal.
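A minimal Python sketch of that clock × cores idea. The shader counts and clocks below are approximate reference specs, and the ×2 assumes one FMA (two FLOPs) per shader per clock:

```python
# Theoretical FP32 throughput = shaders x clock x 2 (FMA counts as two FLOPs per clock).
# Specs are approximate reference numbers, not measured values.
def tflops(shaders, clock_mhz):
    return shaders * clock_mhz * 1e6 * 2 / 1e12

cards = {
    "R9 390  (2560 SP @ ~1000 MHz)":           (2560, 1000),
    "Fury X  (4096 SP @ ~1050 MHz)":           (4096, 1050),
    "GTX 980 (2048 cores @ ~1216 MHz boost)":  (2048, 1216),
}

for name, (shaders, clock) in cards.items():
    print(f"{name}: ~{tflops(shaders, clock):.1f} TFLOPS")
```

Lower-clocked GCN cards still end up with more raw throughput on paper because of the wider shader arrays.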

Yeah... no.

It's just a reflection of GCN's architecture. The clock speed of a GPU barely tells you anything about its performance anyway. Hell, the clock speed of any processor, period, barely tells you anything about its performance in 2017. Anybody who says it has anything to do with temperatures only understands a small part of the story.

If clock speed meant everything, then Prescott would have been the shit. Clock speed can be inflated with longer pipelines. Clock speed tells you nothing about how much work the shaders in the GPU are actually doing.

Shader count and, more importantly, shader performance have really been the only two things that ever mattered when judging graphics cards.

Yeah... Yeah? I had mine at 1400MHz stable; as long as you have some case side fans you're good to go.

You just lucked into having a good chip. Most GCN chips would be completely unstable at 1400 MHz regardless of cooling.

My 2 other friends with the same card hit 1400 and 1350... I don't think it's that uncommon.

>GCN was the first to 1GHz. What happened?

Geforce cards hit 2GHz so any previous accomplishment is now irrelevant and forgotten.

Mine runs fine at 1500, stays under 60°C at full load :^)

Nvidia first to 3GHz

lol

Why does my air-cooled Fury seem to barely perform any better than the 6950 it replaced?

your shit's fucked senpai

>using the smiley with a carat nose

>carat

>carrot

sent :^)

Uhm... not 3GHz... hello?

>2017
>still believing the GHz myth

hello i have this card

it gets hot and loud

what do i do now

help me

>using the smiley with a carat nose

help

>muh meba hurts

It's irrelevant. An RX 480 trades blows with a GTX 1060 despite giving up 700MHz+ in raw clock speed. An i7-5960X smashes an FX-9590 into orbit despite giving up nearly 1.5GHz in raw clock speed. Comparing clock speed between architectures is dumb.
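Rough numbers on that RX 480 vs GTX 1060 comparison, using the same clock × cores formula as above. Reference boost clocks are assumed here; retail 1060s typically boost well past the reference figure, which is where the 700MHz+ gap comes from:

```python
# Theoretical FP32 = shaders x clock x 2 (FMA). Reference boost clocks assumed;
# retail cards, especially the 1060, usually clock higher in practice.
rx_480   = 2304 * 1266e6 * 2 / 1e12   # ~5.8 TFLOPS
gtx_1060 = 1280 * 1709e6 * 2 / 1e12   # ~4.4 TFLOPS
print(f"RX 480:   ~{rx_480:.1f} TFLOPS at 1266 MHz")
print(f"GTX 1060: ~{gtx_1060:.1f} TFLOPS at 1709 MHz")
```

On paper the lower-clocked card has more raw throughput; how much of that actually reaches the frame is the architecture's problem, not the clock's.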

The R9 390 is also a poor example of "low" clock speeds hurting anything, given that Hawaii/Grenada still kicks the shit out of the current mid-range in some titles and is generally there or thereabouts across the board.

The Fury X's problems weren't clock speed, but architectural. Its biggest weakness was a mere 64 ROPs (the same amount found on a 290X/390X) trying to service those 4096 stream processors, creating a massive bottleneck. This was an inherent GCN architectural limitation.
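Crude back-of-the-envelope on that ROP bottleneck. The counts are reference specs, and shaders-per-ROP is only a rough indicator of how starved the back-end is:

```python
# Shaders per ROP as a crude measure of back-end starvation.
# Counts are reference specs for each chip.
chips = {
    "R9 290X / 390X (Hawaii/Grenada)": (2816, 64),
    "Fury X (Fiji)":                   (4096, 64),
    "GTX 980 Ti (GM200)":              (2816, 96),
}
for name, (shaders, rops) in chips.items():
    print(f"{name}: {shaders / rops:.0f} shaders per ROP")
```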

Sell to miners, get 1070

Is Vega just Fiji but without the bottleneck?