AMD RX 480 Temperature leak

Some chinese youtuber leaked how hot the 480 gets under load: 71.9°C with the reference cooler. youtube.com/watch?v=4LBCCt4dejM

>no GPU clock
Wow, it's fucking nothing

72C is not hot.

New OCing tools confirmed, wccftech wasn't lying apparently.
Guess that also confirms above 1500MHz clocks too.

Neat

Compared to the 95°C you can see on a 390X, this card is a freezer

>reference

No way Jose

considering the crap-tier cooler the reference card has, that's pretty good

it's low enough that it won't throttle (ur mum) either

>chinese

You blind OP? Those are Korean glyphs

But it'll probably sound like a jet

probably with the fan at 100%

A card with this shitty heatsink only goes to 72°C under load. Even if that's at stock clocks, it's pretty good.

/thread

Impressive.
Gives me hope for undervolted passive builds

Why don't blowers have the fan closer to the middle? They could still expel air from the front that way.

Chinese streamer supposedly said it was very quiet, though no numbers.

wccftech.com/amd-rx-480-overclocking-tool-leaked/

I was lied to.

Some dual GPU cards do. It doesn't make much sense for single GPU though, since the GPU will be near the middle of the card, so you couldn't put the fan there. Even if the GPU were moved to one side, you'd need to have heat pipes going around the fan to move the heat to the other heatsink. It's just not practical.

I have an Nvidia GTX 465 which idles at 76°C (169°F), so really, a max temp of 72°C under load isn't shit. I'm getting an RX 470 for sure, can't w8 m8.

Because then the heat won't go directly up into the fins but instead has to go around. Putting the fan in the center of the card is only really doable with dual-GPU cards that have sufficient spacing between the GPU heatsinks to fit one, and even then such cards have a rather well-earned reputation for running loud.

Why the fuck are you still running a GF100 chip these days?

do you happen to live in the core of the sun?

well shit, can't wait for third-party cards then

no he's actually using fermi

>unironically using fermi, the housefire meme gpu
you brought this upon yourself

Get ready for the amazing OCing

Fucking saved.

>72C under load

That's pretty good.
Hope they lower it even on the reference card, though.

Feels good to be a gang$ta

72 IS hot at stock

For a blower card? Hardly.

72C for any card is not that hot, and considering the cooling hardware involved its pretty damn good.

My last high-end reference card was a GTX 280, which would hit the 80s (°C) under load

Great to see the RX 480 reference hitting just 71°C under load

Yeah, my heavily OCed 970 hardly reaches 65°C at load when the ambient is almost 30°C

Nice digits

This. I had to put a Scythe GPU cooler on my old card so it wouldn't go above 80°C. With the stock cooler it hit 92°C even with the blower at max RPM.

Then again, that was an overclocked 6970 doing bitcoin mining.

72°C at stock means that decent aftermarket coolers will average in the 60s.

I'm more worried about VRM temps, though.

It may be that the heatplate covering the card extends over the VRM chips, and that that general area has fins for heat removal. Yes, having the VRMs after the GPU in the airstream will result in higher temps vs. putting them before it, but there's that much heat-mass to dump the energy into, and it's been done before (several midrange reference Kepler cards on the green side did it, and I think it's common practice for lower-end and midrange cards in general).

Friendly reminder.

The funny thing is that some people actually managed to twist this into a good thing.

Which card?

The (quiet) and (uber) should make it obvious.

290X.

>msi 480 cyclone edition
NOW'S THE TIME.
BRING.IT.BACK

That ref cooler was such shit.

I remember choosing between the msi cyclone and sapphire toxic 4890 many years ago.
Great stuff

friendly reminder: nvidia caps their chips at 85°C because they don't want you hurting yourself, because you're incompetent.

Still tells us nothing. We don't know the clock speed, we don't know how hard the fans were spinning, and we don't know how this cooler compares to third-party ones. If the fans were spinning at 100% we'd at least know how hot the GPU gets under load, but since we don't even know that, this information is pretty worthless.

>Reference cooler

Why do you Sup Forumsingers care what temp the card runs at?

Why did they discontinue it? It was pretty based.

>reference coolers are supposed to be sub 30 at load

>fe 1080 runs 80ish at load
>wow many cool. such temp

moral of the shitpost: don't expect ref coolers to run cool or quiet. And don't buy/PAY EXTRA for reference cards unless you plan to rip off the stock cooler and put some other cooling solution on it.

How are they getting away with such shit? It's old-AMD-stock-CPU-cooler tier, which is an elaborate way of saying "garbage".
AND STILL it's supposed to hit only 72 degrees under load? What kind of magic is this?

>150watt tdp

nobody buys reference cards

right?

Does 1080/1070 not have a custom temperature target akin to Maxwell?

With a TDP that low and 71°C at load... I'm really thinking about going reference. It's a fucking $200 card with cheap plastic and a sticker on the fan. If anything I'll rip it open and tie-wire a Corsair fan onto it.

Depends. I'm not paying an extra $100 for a 3rd party card to jew me out of overclocks that I can do myself for free.

GO AMD GO

Useful if you have a small case or otherwise have poor case airflow, as blower coolers keep the heat out.

Buy a third-party card; the stock one is going to be limited on OC a bit

If you look at the reference PCB, the GPU core itself takes all its power from the slot, and the memory takes its power from the 6-pin

You have to be literally braindamaged to buy a reference-cooled AMD card. This shit is loud.

Jesus, I did not realize that the fan assembly extends out longer than the PCB. That card is tiny. I can't wait to see what the custom coolers look like, and if any of them cut the length of the cooler to match the PCB, I won't have to think any longer about whether or not I'm buying one.

I need somewhere on the card to put my Amada Kokoro sticker.

Going to sound like a jet engine...

And current Nvidia cards with their overheating issues don't?

It's got no flippin' heatpipes. We can assume that a proper, high-TDP cooler design will drop temps by at least 20%.
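Quick sanity check on that 20% figure: a cooler really only controls the delta over ambient, not the absolute reading, so the estimate should be applied to that delta. A sketch (the 25°C ambient is an assumption, not from the video):

```python
def temp_with_better_cooler(load_temp_c, ambient_c, improvement):
    # A cooler affects the delta over ambient, not the absolute
    # temperature, so apply the improvement fraction to that delta only.
    delta = load_temp_c - ambient_c
    return ambient_c + delta * (1 - improvement)

# 72C reference load temp, assumed 25C ambient, the 20% claim above:
print(round(temp_with_better_cooler(72, 25, 0.20), 1))  # 62.6 -> low 60s
```

Which lines up with the "aftermarket coolers will average in the 60s" guess elsewhere in the thread.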

What overheating issues?

this is the new OC tool from AMD

kinda potato resolution but oh well

80°C+ and throttling within 10 minutes of starting a full load. Apparently Nvidia hasn't learned their fucking lesson from Fermi: new architecture + new node + high clocks = ridiculous heat.

The issue is that FE cards throttle down from their boost clock to base after 10 minutes, due to:
1) being too hot (82-85°C)
2) surpassing the vBIOS power limit of the card

i.e. the card wanted quite a lot more power than the connector along with the PCIe slot could provide, hence why you see two connectors on the rest of the cards.
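The power-budget arithmetic behind the "two connectors" point follows straight from the PCIe spec limits (slot 75 W, 6-pin 75 W, 8-pin 150 W); the card names in the comments are just illustrative examples:

```python
# In-spec power limits: PCIe x16 slot 75 W, 6-pin aux 75 W, 8-pin aux 150 W.
SLOT_W = 75
PIN6_W = 75
PIN8_W = 150

def board_power_limit(aux_connectors):
    # Total in-spec board power: slot allowance plus each aux connector.
    return SLOT_W + sum(aux_connectors)

print(board_power_limit([PIN6_W]))          # 150 -> RX 480 reference (slot + 6-pin)
print(board_power_limit([PIN8_W]))          # 225 -> GTX 1080 FE (slot + 8-pin)
print(board_power_limit([PIN6_W, PIN8_W]))  # 300 -> a typical custom high-end layout
```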

...

muh mrsp

SCAAAAAAALPEEEEEERS
I wish Amazon did something to prevent it, they don't give a fuck.

First off, 83C isn't overheating. Secondly, it's designed to work that way. Nvidia cards aren't, and haven't been for quite some time, designed to maintain a specific clock rate. It is supposed to be dynamic, dependent on the temperature. This is exactly the function of GPU Boost or whatever.

If you could choose between a card that starts off at a high clock rate, which gradually lowers as the card/ambient temperature rises to stay at its temperature target, and a card that just defaults to the supposed minimum clock rate, which would you choose? While the performance of the former eventually drops to the level of the latter, it still achieves better performance before the card has heated up. And if the load on the card isn't that big, it'll stay at a higher clock rate, offering better performance. This is the benefit of a dynamic clock speed.

Calling the intended function of the card throttling is just idiotic, and shows that you're completely tech illiterate.
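The temperature-target behavior described above can be sketched as a toy control function. To be clear, this is an illustrative model, not Nvidia's actual firmware logic; the clock numbers are the GTX 1080 FE's advertised base/boost, everything else (the 60°C floor, the linear back-off) is made up:

```python
def boost_clock(temp_c, base_mhz=1607, boost_mhz=1733,
                temp_target=83, temp_floor=60):
    """Toy temperature-target model: full boost while cool, linear
    back-off toward the base clock as temp approaches the target."""
    if temp_c <= temp_floor:
        return boost_mhz
    if temp_c >= temp_target:
        return base_mhz
    frac = (temp_target - temp_c) / (temp_target - temp_floor)
    return base_mhz + frac * (boost_mhz - base_mhz)

print(boost_clock(50))  # 1733 -> cool card holds full boost
print(boost_clock(83))  # 1607 -> at the temp target it sits at base clock
```

The point of the model: the clock is a function of temperature by design, so the card starting high and settling lower is the intended curve, not a malfunction.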

The only and I mean the ONLY reason GPU Boost type firmware was created was to fuck with benchmarks and give that little extra 5-10% edge against competition. "GPU Boost" is just an afterthought from firmware-level temperature control. It's a waste of time.
Also boosting up when the GPU isn't running full load is absolutely useless because presumably you don't need that extra speed at that point anyways.

amazon is absolutely fine with getting money from the same product twice.

>The only and I mean the ONLY reason GPU Boost type firmware was created was to fuck with benchmarks and give that little extra 5-10% edge against competition.
Yeah, ok.

I doubt there's a point in trying to argue with someone like you.

That looks really nice. Also, it's the first time in GPU history that a first-party setup application lets you add voltage to your OC.

>This is what a low-knowledge Nvidishill who knows he can't win against someone who actually knows what they're talking about looks like

>be chink
>gets cardz gibbed
>signeru paperu with funny scribbles
>devil gaijin funny gibberish scribbles means nothing to me (my signature translated to 'fuck the whale eating virgin nips').
>get million hits on baidu.

them glorious chinks.

>tfw a chink neet makes you jelly.

>14nm
>new GCN architecture

it's really not that hard to see why

what's the matter, I thought money was no object, you're not poor are you :^)

neat

damn son, that denial.
Nvidiots are officially mourning.

Here is the video, reuploaded since YouTube had the original taken down

vid.me/xs9x

The cards aren't "boosting," they're thermal throttling. Intel CPUs boost. They maintain that boost for extended periods of time. Unless you only use a piece of equipment for 10-20 minutes at a time, performance dropping after 10 minutes is ALWAYS A BAD THING. Like shit dude, what if we were talking about cars? What if someone was trying to sell you a car that couldn't adequately cool itself and would start losing horsepower as you drove it? You wouldn't fucking buy it, because that's retarded and unsafe.

That said, Nvidia has a great high-end chip. If you put a good cooler on it and clock it to the sweet spot, it will be great.

Is this throttling talk legit?
Surely the issue doesn't remain on cards without the Flounder's Edition reference blower.

Feel free to use 3rd party oc tools. Choice enough.

>The only and I mean the ONLY reason GPU Boost type firmware was created was to fuck with benchmarks and give that little extra 5-10% edge against competition.
>Also boosting up when the GPU isn't running full load is absolutely useless because presumably you don't need that extra speed at that point anyways.
Aren't these contradictory?
If the GPU only boosted when it wasn't full load then it wouldn't give any improvements in benchmarks. Unless the benchmarks aren't running at full load, which would then suggest that the games, which are often less intensive than benchmarks, also wouldn't be running full load.

Intel CPUs will maintain boost as long as the thermal properties remain favorable.
My CPU's base clock is 3.4 GHz and it will comfortably sit at 3.9 most of the time, but if I run something like IBT it will drop to 3.5.

That said, a dynamic boost clock is more favorable to the CPU side than the GPU side because CPU processing is often done in bursts of a few seconds, while GPU loads tend to be much longer.
Yes, people do encode videos and things which take much longer on the CPU but overall that's not a typical CPU load.

Is CF viable?

No. If you do, one GPU will be mostly loaded while the other sits mostly idle, as demonstrated by AMD themselves when they compared two 480s in CF to a 980 (Ti?) and it was showing about 51% total load.

Man, it's going to suck having to wait for 3rd party coolers.

You completely miss the point - those cards can't sustain their boost clocks under heavy loads (such as maxed-out games in a benchmark) for more than 10-20 minutes. Benchmark runs are typically 1-3 minutes. Very few reviewers let the card reach thermal saturation before testing, nor do they keep the card thermally saturated throughout testing all games. So in effect we see these FE cards always at their boost state in tests, while in reality they tend to dip down to the stock clock speed not even a half hour into a gaming session.

>aren't these contradictory
user, that's just how GPU Boost works. It will let the card clock higher if power and temp limits aren't reached, regardless of whether the card needs to run that fast. Retarded behavior IMHO

It was 51% utilization in the light loads, which were CPU-bottlenecked.
Some very smart people from the AotS company ran the numbers and figured that a 1.83x CFX scaling ratio gave the ~50% performance improvement in Ashes
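The scaling claim is easy to sanity-check with some arithmetic; the fps numbers below are made up purely to reproduce the quoted 1.83x ratio, not taken from the AotS demo:

```python
def cfx_scaling(single_fps, dual_fps):
    # How many "single cards" of throughput the CF pair delivers.
    return dual_fps / single_fps

def avg_gpu_utilization(scaling, n_gpus=2):
    # Average per-GPU load implied by a scaling ratio.
    return scaling / n_gpus

# Made-up fps chosen only to hit the 1.83x figure quoted above:
ratio = cfx_scaling(single_fps=34.0, dual_fps=62.2)
print(round(ratio, 2))                       # 1.83
print(round(avg_gpu_utilization(ratio), 3))  # 0.915 -> ~91.5% average load
```

So a 1.83x ratio implies both GPUs averaging ~91.5% load in the GPU-bound parts; a blanket 51% reading doesn't contradict that if it averages in the CPU-bottlenecked scenes where both GPUs sit idle.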

pucker up nvidiots

I am no Nvidia fanboy; in fact I was hoping to grab a 480 myself. However, we are forgetting something. The 1070 under load is going to be drawing around the 160-180 W mark depending on the load. The 480 is only capable of 150 W max on the reference card, but I doubt it's drawing that much in the video. So the question becomes: if the 1070 runs cooler than the 480 under load while using higher wattage, it does not look too good for the 480. I hope this is incorrect though.

>It was 51% utilization in the light loads, which were CPU bottlenecked.
So basically it was rigged. Unless you expect to tell me the Nvidia card was also under light loads but with nearly 100% utilization.

Try playing the AotS benchmark some time, it was on sale last week. You are either trolling, very stupid, or just don't understand how the benchmark scores itself.

For you.

Get tiny ones you can stick in the middle of the fans when the custom aftermarket cards come out. The fans will probably only turn on when you hit 60C just like the 300s do unless you do a custom fan curve.

And this is for you.
Never fucking buy reference, not AMD, not Nvidia.

I don't care how it scores itself, I'm interested in frame rates and GPU utilization.

It's a shame since the new reference cooler looks really slick. I know that's a shitty reason to consider a GPU but I can't really help it.