So now that based AMD has revealed the specs of the RX 480, what do you think the specs of the fully unlocked RX 480X will be? How well will it perform? How much more power will it consume, how much will it cost etc.?

Performance: Still less than a 1070
Power consumption: Housefire
Cost: Your house

RX 480 is the '480X' you speak of.
There will be an R9 480 which is what you appear to think the RX 480 is.

Spot the nvidiot shill

Thanks Pajeet, was confused about that as well.

i'm wondering if this card will be in new macs. i want to upgrade my hackintosh gpu

hardocp.com/article/2016/05/27/from_ati_to_amd_back_journey_in_futility/

>In the simplest terms AMD has created a product that runs hotter and slower than its competition's new architecture by a potentially significant margin.

This HOUSEFIRE is already at 150W TDP; enabling more shaders means going over 150W TDP while still being much slower than the GTX 1070 at its 150W TDP

You are buttdevastated.

>website was not invited to launch event
>starts ranting without any sources

Does it still hurt, Kyle?

Why is HardOCP so anti-AMD?

At least they were more subtle before they were refused the Nano...

...

>150W TDP
TDP != power consumption

Protip: TDP gets measured by measuring the power consumption, because measuring the actual heat output is close to impossible.

No, it's an approximate figure for cooler designers. For instance, Intel CPUs usually draw more power than their TDP specifies
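
Rough illustration of the distinction, with entirely made-up numbers (Python; nothing here is a real measurement):

samples_w = [88, 95, 131, 142, 120, 97]  # hypothetical load readings, watts
tdp_w = 91                               # e.g. a part specced at 91 W TDP

avg_w = sum(samples_w) / len(samples_w)
print(f"TDP {tdp_w} W, average draw {avg_w:.0f} W, peak draw {max(samples_w)} W")
# The chip exceeds its TDP both on average and in bursts; the spec only
# promises what the cooler must be able to dissipate over time.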

According to videocardz benchmark leaks, performance is on Fury/GTX 980 level.
Pretty great for $200 IMHO.

I doubt we're going to see a card with the full Polaris die available on shelves for quite a while because those are likely all being sent directly to OEMs.

>When you just bought Nvidia again after their 970 lie fiasco, and now after the AMD 480, their upper midrange release, you are salty as fuck and try to justify your purchase to yourself.

-The Post

You just answered your question. Kyle threw a bitch fit because AMD wouldn't send him a free card that he could shitpost about on his clickbait website. He's been buttblasted ever since then.

For reference, look at how long it took for them to release a full Tonga chip to end users.

Isn't it still cut, just less?

The r9 380x is the full Tonga, although they renamed it Antigua for the 300 series.

Not him, but I always assumed tonga was just a cut down fury

Not even close, Fury uses Fiji which utilizes HBM. Tonga uses GDDR5.

Yep, R9 285's launch date was 09/02/14, while R9 380X's one was 11/19/15. Hopefully this won't be the case here.

Can they not simply replace the memory controller?
Lots of budget cards come with either ddr3 or gddr5.

That would entail a completely separate chip since the memory controller is built in.

Do we have an idea of the temperature? Will it be a housefire like a lot of people here say?

RX 480X is going to be exclusive to iMacs for all of 2016. Its performance is on par with the non-X Fury (going by leaked benchmarks). The RX 480 is slightly below a 980, and almost at 390X level performance-wise.

So they make multiple versions of the same gpu?

My laptop gpu has 2 models, 2gb ddr3 or 1gb gddr5.

source?

but full Tonga has a wider memory controller than the 380X

pcper.com/news/Graphics-Cards/AMD-Confirms-Tonga-384-bit-Memory-Bus-Not-Enabled-Any-Products

Makes sense that they did that, though.
Enabling the additional 128 bits would have increased power draw by about 20W (11%), but performance would have been boosted by less than 5%. They would also have had to add another 2 GiB of memory, which would have increased the price.

In the end, they would have increased power draw and price of the card by over 10%, for at best 5% more performance. Even spending that 20W on the core would have been more efficient, since it actually is limited by the default power target.
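
Quick sanity check of that math in Python, assuming a ~180 W board power so that 20 W comes out to the quoted 11% (the 180 W baseline is an assumption, not an official spec):

base_power_w = 180.0                  # assumed board power, 256-bit Tonga
full_power_w = base_power_w + 20.0    # +11% for the extra 128 bits
base_perf, full_perf = 1.00, 1.05     # normalized; +5% is the optimistic case

print(f"256-bit: {base_perf / base_power_w:.5f} perf/W")
print(f"384-bit: {full_perf / full_power_w:.5f} perf/W")
# 1.05/200 < 1.00/180, so perf/W drops even in the best case; on a
# power-limited chip the same 20 W buys more than 5% as core clocks.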

Pretty sure that 36CU is the full die.

Power consumption for the 5K iMac isn't bad though. It looks like the GPUs they sourced are an extremely high binning, or they have some power-saving IP not enabled in consumer Tonga, closer to what is seen in Fury. It makes sense as they're both 3rd gen GCN, but consumer R9 285 cards were surprisingly lacking.

earlier rumors said 2560 stream processors (40 CUs) and probably a 100-150mghz higher clock

Because Kyle got into an argument with AMD over something stupid and now won't stop shitting on them because he didn't get his free samples in time.
What an autist

This would make the most sense.

>MGHz
That's one beefy ass VRM

I sincerely doubt GCN is pushing clocks that high; it's not worth the increase in power.

Thanks for the info buddy, now neck yourself

How does the 480 do compared to a 970 overclocked to 1500 MHz? I already have the 970, but I want something with more than 3.5 GB, and the 1070 will most likely be 500€ or more.

It's a worthwhile upgrade if you've got the extra cash imo, and you can always get another later

Well, 480s in CF beat a 1080 for cheaper.

I'd stick with the 970 though unless it's not cutting it for whatever it is you do with it.

Who knows, there aren't many benchmarks available yet. It's most likely not quite as powerful as marketed except in some handpicked situations. They are just doing what 90% of companies do before launching a big title, hyping it up as the second coming of Christ. If the actual performance is anywhere near what's currently being advertised, I'm buying it though.

...

480 has 36 CU
480x will probably have 40 as opposed to 44, because otherwise the performance gap between it and Vega will be too small

Stock 480 should be at least 15-20% faster than a 970. (390x and 980 are about 17% faster)
A heavily overclocked 970 would come fairly close to a stock 480, but then you can still overclock the 480.
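
Back-of-the-envelope version of that in Python (the ~1178 MHz reference boost is the 970's typical stock boost clock; the 0.7 perf-per-clock scaling factor is a rule-of-thumb assumption, not a benchmark):

ref_clock_mhz, oc_clock_mhz = 1178, 1500
scaling = 0.7  # assumed fraction of a clock increase that shows up as fps

oc_gain = (oc_clock_mhz / ref_clock_mhz - 1) * scaling
print(f"Estimated 970 uplift at 1500 MHz: {oc_gain:.0%}")  # ~19%
# A ~19% OC uplift lands right in the 15-20% gap quoted above, which is
# why a 1500 MHz 970 should roughly match a stock 480 before the 480's
# own overclocking headroom comes into play.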

what's with the 59Hz-on-HDMI post?
What's the difference between 60 and 59?

Does that mean I only get 59 fps?
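
Not quite; the "59 Hz" entry is the NTSC-derived 60/1.001 = 59.94 Hz timing that HDMI displays expose alongside true 60 Hz, and Windows just rounds the label down. The difference in frame time is tiny:

from fractions import Fraction

ntsc_hz = Fraction(60000, 1001)       # 59.94005994... Hz, the "59 Hz" mode
print(f"59.94 Hz frame time: {1000 / float(ntsc_hz):.3f} ms")
print(f"60.00 Hz frame time: {1000 / 60:.3f} ms")
# ~16.683 ms vs ~16.667 ms per frame: the drift adds up to one
# duplicated/dropped frame roughly every 17 seconds, which only matters
# for A/V sync work, not for whether you "get 59 fps".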

That's because whatever chip is in your card has a memory controller that can utilize DDR3 or GDDR5. That's not that difficult to do, since GDDR5 is just an evolution of DDR3, which is an evolution of DDR2, which is an evolution of... and so on and so forth. Look at NV's 1080/1070: the 1070 is just a cut-down 1080 that uses GDDR5 instead of GDDR5X, and the memory controller can handle that because GDDR5 and GDDR5X are almost identical from a structural standpoint.

That's not the case for HBM. HBM is made up of stacks of dies sitting on top of each other, which is totally different from all the various iterations of DDR that are laid out on a plane. It might be possible to make a memory controller that could address HBM as well as traditional DDR, but it would make absolutely no sense when you factor in the cost to develop and deploy it. Also, HBM sits in a completely different performance bracket, so it makes no sense to waste money planning out a die that would be used in such disparate market segments.

(yeah I know that I've oversimplified it to the point of being almost factually wrong)
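
For the bandwidth side of that post, the napkin math is simple (Python; the clock and bus-width figures are generic examples, not any specific card):

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    # Peak bandwidth = bus width x per-pin data rate / 8 bits per byte.
    return bus_width_bits * data_rate_gbps / 8

print(f"GDDR5, 256-bit @ 7 Gbps: {bandwidth_gbs(256, 7):.0f} GB/s")              # 224
print(f"HBM, 4 stacks of 1024-bit @ 1 Gbps: {bandwidth_gbs(4096, 1):.0f} GB/s")  # 512
# HBM gets its bandwidth from an enormously wide, slow bus, which is the
# structural difference that keeps one controller from serving both.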

GDDR and DDR are totally different types of memory though. Please don't mistake GDDR3 for DDR3; they are different.

GDDR5 and DDR3 are pin compatible, though they handle signaling differently.
GDDR5 memory controllers can in fact use the same DDR3 DRAM used in desktop DIMMS.

Of course they're different, just like how a pajeet is different from a man. HBM, however, would be more like a silverback gorilla.

>tfw just bought a Geforce 960 for 200€
>together with a new monitor that requires a better GPU to even work
My timing is always glorious.

AMD wins again

Not in CF, but by using the multi-GPU technologies that DX12 has.

NVIDIA WON

That doesn't sound like a very smart way to name your GPUs, since many people refer to cards solely by the numbering scheme, which AMD is definitely aware of.

youtube.com/watch?v=04ITA1_XoqM

are we just going to overlook amd's marketing tactics?

Considering everyone has been aware that this was coming since last year, you're pretty fuckin' dumb bro.

What's it like to not do research before buying things?

being so buttblasted you post a video on youtube about it
holy kek

And they accuse him of being an Nvidia shill

The beauty of Mantle, and every fork of it, is that it plays to AMD's strengths and weaknesses.

This is why async is such a big thing: AMD always had it, while Nvidia saw no benefit in it before.

you can be sponsored by Intel, AMD and Microsoft, yet still be an nvidia shill.

I didn't know there were leaks of this GPU beforehand.

I think it's safe to say he shills for both Nvidia and AMD

I thought he was a corsair shill

He shills for anyone willing to pay

He's a Noctua fetishist.

doesn't that just make him a positive person

And its also significantly cheaper ;^)

> tonga and no 384b bus

ah yes, the time honored Radeon tradition of wasting die space on shit that never got used...

Neutral you mean

>Protip: TDP gets measured by measuring the power consumption
What? No! You knuckle dragging mouth breather, why do you post when you have no idea what you're talking about?

All the Nvidia shills are so butthurt, this shit is golden

>So now that based AMD has revealed the specs of the RX 480, what do you think the specs of the fully unlocked RX 480X will be?

RX 480 is already the uncut die. It's only 230 mm^2, so it won't need as much binning as the 312 mm^2 GP104/1070/1080.
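
The binning point follows from basic yield math. A toy Poisson defect model in Python (the defect density is a made-up but plausible figure, not foundry data):

import math

def defect_free_fraction(die_area_mm2, defects_per_cm2=0.2):
    # Poisson model: P(zero defects) = exp(-area * defect density).
    return math.exp(-(die_area_mm2 / 100) * defects_per_cm2)

for name, area in (("Polaris 10", 230), ("GP104", 312)):
    print(f"{name} ({area} mm^2): {defect_free_fraction(area):.0%} clean dies")
# ~63% vs ~54%: the smaller die leaves fewer partially-defective chips
# that have to be salvaged as cut-down SKUs.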

>480x will probably have 40 as opposed to 44, because otherwise the performance gap between it and Vega will be too small

Do you honestly think AMD's going to build an HBM2/interposer-based GPU that's not at least 50% larger than Polaris?

Each HBM2 module provides 250 GB/s of bandwidth, so you'd need at least two to make a chip worthwhile.

A 2x/3xHBM small and a 3x/4xHBM large Vega would both be well beyond what Polaris 10 could ever hope to offer.
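
Putting numbers on that, using the 250 GB/s-per-stack figure above against a plausible 256-bit GDDR5 Polaris 10 setup (the 8 Gbps memory speed is an assumption):

HBM2_STACK_GBS = 250
polaris_gbs = 256 * 8 / 8   # 256-bit bus x 8 Gbps per pin / 8 = 256 GB/s

for stacks in (2, 3, 4):
    bw = stacks * HBM2_STACK_GBS
    print(f"{stacks}x HBM2: {bw} GB/s ({bw / polaris_gbs:.1f}x Polaris 10)")
# Even the smallest 2-stack Vega config would roughly double what a
# 256-bit GDDR5 card can offer.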

This.
Also Nvidia shills on suicide watch:^)))))))

If it was the full chip they would call it 480x. That's almost certain

He looks like an east european gay porn star

> RX 480X
Fuck that. Give me a 490X.

It's new branding, genius.

> R9 3??[X]
> RX 480

Just like ATI went from the 9000 series to the X000 series ca. 2005, they probably wanted to bump R9, kept the X=10 thing, and decided that too many Xs in a name is confusing.

Tumblr memes belong on >>>/tumblr/

Very well priced given the performance it will offer.

Glad AMD is putting some value on the line in GPUs. Hopefully it will force Nvidia to reduce their prices as well.

My 680 isn't dead yet, but with VR becoming more commonplace I am looking at upgrade options.

>yfw RX 480 Pro

YOUR QUEEN.

>RX 480X
>Two X
>Twice as fast than GTX
>OPPA

Pajeet says it will run cool.

He shills for anyone who will pay him. Nvidia does it so amd is doing it too now.

>RX vs 1060 FUCKING WHEN

>Via 9gag.com
KYS

hardforum.com/threads/from-ati-to-amd-back-to-ati-a-journey-in-futility-h.1900681/

does everyone on that forum suck the op's dick?

9700 Pro was such a fucking godly card, I'm just not sure the "Pro" moniker belongs on anything in the Polaris lineup given Vega's near future existence.

but yeah, ATI/AMD naming has been a shitshow at various times in the past
> Rage ('95-'99): Pro, XL, VR, GL, Ultra, MAXX
> Radeon 8000 ('01): LE, SE, Pro
> 9000 ('02): Pro, SE, XT, XXL
> X000 ('04): LE, SE, XT, Pro, GT, GTO, XL, XT PE, XT Platinum, XT Crossfire Master
> X1000 ('05): Pro, XT, GT, GTO, XTX
> HD 2000 ('07): Pro, XT
> HD 3000 ('07): X2
> HD 4000 ('08): X2
> HD 5000 ('09): Eyefinity Edition
> HD 6000 ('11): [just numbers]
> HD 7000 ('12): GHz Edition, Boost
> HD 8000 ('13): [just numbers]
> Rx 200 ('13): E, X, X2
> Rx 300 ('15): X, Fury(X), Nano, Pro Duo

gddr5 is just ddr3.

Anyone on integrated graphics here? Feels good to be master race.

> tfw Iris Pro

I almost came.

I'm waiting on the final reviews from independent sources, but if this is true, I might get one for $200. Or I might wait for Vega.

You get banned if you don't.

GDDR5 has much higher bandwidth; 2GB of GDDR3 would bottleneck any card since the GeForce 8000 series / ATi X1000 series

$200 for 4GB, $230 for 8GB widely confirmed at this point.

I'll probably get the 8GB model ASAP then upgrade to Vega/GP102 next Spring or get a 2nd RX 480 depending on how things go with DX12 multi-GPU.

Might as well just buy the 4GB card if you're going to upgrade so soon.