I classed it as mid-high range, 1070s being high end and 1080s being enthusiast.
Jayden White
Because they fucking suck.
William Mitchell
Up until Maxwell, Nvidia was the housefire company
so what gives?
Gabriel Bennett
>implying that would be a bad thing
Jonathan Gomez
Nvidia is a bunch of chinks and AMD is a bunch of poos
Logan Garcia
It's AMD top of the line, but depending on how stuff like the 1060 performs I think its safe to classify the 480 as midrange.
Camden Cook
>AMD top of the line
That's coming in October. Not expecting much though; everything they've been doing since 2011 has been a disaster, and Polaris proved they won't be recovering anytime soon
Eli Nelson
Nvidia decided that they don't need the async compute unit and threw that shit out of their GPU.
Ian Martinez
The 480 delivered on everything it said it would, and will only perform better as dx12 is adopted.
Only retards were saying the 480 was equal to a 1070. It was always supposed to perform between the 970 and 980, and that's exactly what it does.
The only thing they fucked up was the 156W draw and saying fuck you to PCIe specs.
It's a good, cheap card that doesn't support the horrible company that is nVidia. It never claimed to be anything more than that.
Elijah Anderson
>cuck.jpg
Easton Taylor
It seems that the card is drawing all that power from PCIe because the motherboard allows it to. It's controlled by software: if the card asks for more than the mobo allows, it just shuts off, for example if your card is powered by the PCIe slot only.
Jose Ortiz
But the fucking power draw, man. Why is AMD so incompetent at that, especially after the Fermi/Kepler disasters?
Henry Cooper
REEE SHILL. Guys, stop replying to the buttmad shills. AMD has a better and cheaper card, and nvidia shills are working round the clock to compensate.
Leo Wilson
Even Adoredtv agrees the architecture is shit.
It loses on every front to Pascal, except for being cheap.
Which is exactly what we expected
Samuel Powell
Honestly, I think GCN is simply more inefficient by design. I don't think AMD will beat NVIDIA in efficiency unless they change the entire architecture.
Jeremiah Wilson
Yes I'm sure they care about you and your birthday card and christmas money very much user.
Wyatt Kelly
There's nothing wrong with its power draw. GCN is a short-pipeline arch not made for high clocks. AMD went with a smaller, cheaper-to-produce die for the midrange, then clocked it as high as possible to squeeze out all the performance they could, at the cost of increased power. The 480 isn't made to be power efficient; in effect it's a heavily factory-OC'd card. It was made to be a high perf/dollar GPU while still keeping good margins.
We even know that AMD didn't bother clock binning the dies, and all pstate voltages are considerably higher than necessary because of it.
Tell that to Polaris 11.
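As a back-of-the-envelope sketch of that tradeoff (and of what skipping voltage binning costs): dynamic CMOS power scales roughly with f*V^2, so an undervolt at unchanged clocks cuts power quadratically. The figures below are illustrative assumptions, not measurements:

```python
# Dynamic CMOS power scales roughly as P ~ f * V^2 (capacitance held constant).
def dynamic_power(base_power_w, base_mhz, base_v, mhz, v):
    """Scale a known power figure to a new clock/voltage point."""
    return base_power_w * (mhz / base_mhz) * (v / base_v) ** 2

# Hypothetical card: 150w board power at 1266mhz / 1.15v.
stock = dynamic_power(150, 1266, 1.15, 1266, 1.15)
undervolt = dynamic_power(150, 1266, 1.15, 1266, 1.05)  # same clock, -0.10v
print(f"stock: {stock:.0f}w, undervolted: {undervolt:.0f}w")
# prints "stock: 150w, undervolted: 125w"
```

In this toy model a 0.10v undervolt at the same clock shaves roughly a sixth of the dynamic power, which is why overvolted-for-yield reference cards leave so much on the table.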
Samuel Ross
>that's coming in October
No it isn't
Anthony Torres
1080 gets up to 82C
I'd say Nvidia still is the housefire company
Brayden Roberts
>Tell that to Polaris 11.
I hope so, friend. I want AMD to do good.
Levi Murphy
>The 480 isn't made to be power efficient
D A M A G E C O N T R O L
Easton Baker
Yes, overhype the next card
Keep repeating the cycle
Bentley Russell
Yup. I got to benchmark one the other day. It only got 6 scenes into the Heaven benchmark before it hit 82C and the clocks slowly came down.
Liam James
>GCN is a short pipeline arch not made for high clocks
Which is why it's inferior
Jace Scott
Nvidia's entire business model is inefficiency
Every time something new comes out, every 12 yr old on Sup Forums magically becomes a veteran hardware design engineer with years of experience in multiple niche areas. It's really amazing.
You guys should be out making your own PC hardware and making millions.
....oh wait, you're just fat liars. Okay, carry on then, kiddos.
Connor Parker
The GTX 960 does exactly the same. On average the 960 draws less from the motherboard, but when it spikes it can draw almost 100W more than the highest the 480 ever does. Look it up. 960s have been writing off boards for a while, but suddenly AMD does something similar and they're retarded while NVIDIA is still awesome?
Jason Bell
There is nothing to hope for. The 2.8X perf/watt figure was for Polaris 11. It's a lower-clocked card right in its sweet spot at 850mhz with a 0.8375v vcore. It competes against the GTX 950 at stock clocks, and being clocked lower it has more OC headroom. AMD showed a live demo of 11 back in January because that was their real impressive part. They gave press all the nitty gritty details right then, even showed off the die.
Polaris 10 in the RX 480 has a top pstate of 1266mhz with a 1.15v vcore, which is absurd for the process it's on.
>I can't deal with simple facts
>I can't participate in a simple technical discussion
>I just shitpost like a 12 year old on reddit
Very mature.
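Plugging the two quoted operating points into the usual dynamic-power rule (switching power scales roughly with f*V^2; same arch, same process, leakage and die size ignored, so treat this as a napkin sketch, not a measurement):

```python
def power_ratio(f1_mhz, v1, f2_mhz, v2):
    """Relative dynamic switching power of operating point 2 vs point 1 (f * V^2)."""
    return (f2_mhz / f1_mhz) * (v2 / v1) ** 2

# Polaris 11 sweet spot (850mhz / 0.8375v) vs Polaris 10 top pstate (1266mhz / 1.15v).
ratio = power_ratio(850, 0.8375, 1266, 1.15)
print(f"P10's top pstate burns ~{ratio:.1f}x the switching power per transistor")
# prints "~2.8x"
```

That ~2.8x gap between the two points is exactly why the same silicon looks efficient at 850mhz and like a space heater at 1266mhz.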
Owen Lopez
Nvidia knows they're losing market share big time, so they need easy retard memes for spergs to latch on to and repeat ad nauseam so they feel superior about spending $700 of mommy's money on a card to play minecraft.
sad rly, tbqh
Aiden Davis
2 years to go from 3.5 to 4 am i right?
William Ramirez
>le 3.5
Still BTFOs any AMD card
Nicholas Hill
Yup, first it was "muh thermals" and then it was "muh power usage" they just trot out sad excuses as to why you should allow yourself to be ripped off.
unless you get a good deal on a 980ti you're retarded anyway.
This latest Nvidia series is a paper launch.
There aren't even any games worth playing that have DX12, some half assed demos are about it.
VR and 240 hz monitors aren't going to make new cards fly off the shelves no matter how big of mindless consumer whores everyone is these days.
Isaiah Edwards
I'm seriously starting to think that nvidia is afraid of the 480, otherwise they wouldn't need to trash-talk AMD this hard lately.
Zachary Ramirez
amd are communist chinks actually nvidia are freedom chinks
Thomas Lee
Oh please, there were tons of "Nvidia is finished and bankrupt" posts during all the 480 hype. Now we're just rubbing it in your face that the card isn't what you guys hyped it up to be.
Adrian Reed
>losing marketshare
>80+%
Robert Hall
It seems like AMD chose a "bottom up" design approach with Polaris, which is what made the larger chip suffer in efficiency. Scaling out from a target design point, which in this case is absolute minimum draw, usually means the larger derivatives are less efficient than theoretically possible.
I hope that Vega was a "top down" approach instead so their halo products will really be the best they can make.
Landon Perry
This card is suffering greatly at its launch because of the smoking pile of shit that is the ref cooler. The memory and die are heating up as the anemic little block of aluminum sits on them. The VRMs are absolutely cooking. The hotter they run, the more voltage they need to remain stable.
There isn't much binning going on either, most of the cards are severely overvolted for the sake of production speed, as it's sure to be stable at that voltage. That just makes them hotter. Many of these burning hot reference cards can actually be undervolted to drastically cut the power draw and heat.
It's a pretty bad way to introduce the card. With partner boards, though, things should get a lot better. Better coolers applying to all the components (lookin' at you, VRMs) in addition to just not sucking, should tank the temps on this while allowing it to clock moderately higher. Maybe even with lower voltage thanks to the reduced temperatures.
>tl;dr reference a shit, go AIB partner or go home
Benjamin Ortiz
The Polaris 10 die isn't any less efficient in CU throughput per watt than 11 would be if they were clocked similarly. It's just clocked high to reach a performance target and make up for what it's lacking in transistors. If it had 64 ROPs and were clocked lower it would perform considerably better. That's the only area where it's lacking compared to Hawaii/Grenada, it's just pixels/clock. AMD decided to squeeze more out of a cheap midrange die.
I've pointed this out before, but if AMD wanted a ~150w performance king they could have ported Fiji to 14nm. Fiji in the Fury Nano averages 186w in a 4K gaming bench. The only difference between the Fury Nano and Fury X is their average clock rate, since they both use the same boost mechanic. If you disabled it in the Nano and set it to a fixed clock around 800mhz it would already have the same average draw as the RX 480, and it would perform better. The only problem is that it wouldn't be cheap.
Ayden Morales
>The Polaris 10 die isn't any less efficient in CU throughput per watt than 11 would be if they were clocked similarly
Source: your ass
Kayden Lewis
It's the exact same architecture on the same process. A transistor in P11 doesn't magically need any less voltage to switch at a given frequency than one in P10.
Shitposting like that just makes you look like a retarded edgy kid. Which you undoubtedly are.
Nathaniel Nelson
GREAT IDEA TO MAKE ANOTHER GPU THREAD
Isaac Gutierrez
It took AMD 2 years to make a 970.
Truly amazing.
Camden Baker
>what is a 290x/390
Liam Mitchell
Are you 10?
>the 680 performing the same as a 960 implies there's no progress
All you who push this shit meme need to grow a fucking logic gland.
Luke Murphy
970 doesn't draw 250W.
See above, but change the card and the power draw.
Jace Powell
actually we can pin a lot of this down thanks to beyond3d
as in
>we know that nvidia has stripped much of its hardware in order to make it efficient
>nvidia doesn't have any sort of hardware scheduler, they were emulating everything on dx12
>when the async fiasco started in august 2015 people started to dig in and found out it has a small arm chip handling the thread requests statically
>they don't have any sort of dynamic sharing on their cores, they are static as fuck
>their cores can't flip or switch workloads midcycle, thus creating a large lag
>we have seen the very same behavior on paxwell as well
>nvidia will never enable the cuda cores on dx12 for maxwell because that would mean increasing the power draw of the card in order to do async
>they miss key dx12 features yet they claim dx12.1 compliance, they might forget that they need full dx12 compliance first in order to have dx12.1 features...
>nvidia literally built a uarch around dx11, and that was good for them as a company, not really good for anyone else..
>meanwhile now you have
>dx12 being built around amd
>consoles being 100% locked
>amd cards naturally having a lot more hardware on them to accommodate dx12 features
>amd finally added support for sub pixel culling
>they boast a lot of hardware and yet they managed to be on par..
i wonder when volta comes, and it's going to need a hardware scheduler to get good async perf, how they are going to justify the power increase..
also something from the past to laugh at: legitreviews.com/nvidia-highlights-directx-12-strengths-amd_138178
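A toy model of the scheduling point above: if a GPU can't overlap graphics and compute, the frame time is the sum of the two workloads plus any context-switch penalty, while working async queues hide the shorter workload behind the longer one. The millisecond figures are made up purely for illustration:

```python
def frame_time(gfx_ms, compute_ms, async_ok, switch_penalty_ms=0.0):
    """Serialized execution adds the workloads plus any switch cost;
    concurrent (async) execution overlaps them, so the shorter one is free."""
    if async_ok:
        return max(gfx_ms, compute_ms)
    return gfx_ms + compute_ms + switch_penalty_ms

serial = frame_time(10.0, 4.0, async_ok=False, switch_penalty_ms=1.0)
overlap = frame_time(10.0, 4.0, async_ok=True)
print(f"serialized: {serial} ms/frame, async: {overlap} ms/frame")
# prints "serialized: 15.0 ms/frame, async: 10.0 ms/frame"
```

Same workloads, a third of the frame time gone, which is the whole argument about hardware vs emulated scheduling in one line of arithmetic.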
Jose Garcia
>Release shitty product
>Make it not look shitty through advertising and paying off devs
>Get 90% marketshare
Fucking Nvidia, man.
Benjamin Hughes
>tfw everyone said it will be like Fermi vs 5000, pascal is a high clocked housefire, 14 nm polaris will wreck its shit
fucking sad
Liam Campbell
>release a good product
>amd advocates start spreading lies and misinformation on the internet because amd can't keep up
>get 90% marketshare anyway
Fixed that for you.
Jaxon Turner
>gameworks is shit is a fact
>nvidia gimping cards is a fact
>nvidia not being able to do dx12 is a fact
>nvidia lying about everything is a fact
>nvidia buying off devs is a fact
>nvidia launching a big ass festival that talks about vr only all the fucking time, and 2 days before polaris launches jhh goes and says vr is 20 years ahead, is a fact
>nvidia getting their ass kicked on the mobile soc, check
>nxp releasing a chip that is faster, bigger, stronger and uses half the power of drive px2, check
>ibm releasing a neuro chip that uses 1/3 of the px2 power, check
>nvidia about to get its ass kicked in every single possible way by many companies, check
>nvidia about to lose the only big licensing deal, with intel, check
>nvidia lost the samsung deal to amd, check
Jackson Jackson
>match
It's drawing up to 200w from the PCIe hoping reviewers won't notice. It's literally drawing 160+ watts through the 6-pin and the PCIe slot, which are rated for 150w combined.
If you buy 2 of these cards like AMD suggests and OC them, I'm literally going to laugh at how many fagets are going to complain on Sup Forums about melted mobos.
Keep it quiet though, let the numbnuts live in denial while their pair of $200 gpus pulls 400w through the motherboard and melts it.
Zachary Sanchez
another paid shill who thinks amd broke Ohm's law and literally started to draw more power than is physically possible
William Hall
I've always liked these hi-def renditions of closeup microprocessors, it's amazing
>We skipped long-term overclocking and overvolting tests, since the Radeon RX 480’s power consumption through the PCIe slot jumped to an average of 100W, peaking at 200W. We just didn’t want to do that to our test platform.
Chase Watson
Then show actual evidence backing that up
Dylan Scott
>"2 480's will beat a 1080! at significantly lower price!"
>"We forgot to mention it also burns your computer down!"
NEWEGG STILL HASNT SHIPPED MY 480 EVEN THOUGH I ORDERED A MINUTE AFTER LAUNCH FUCKKKKKKKKKKKKKKKKKKKKKK
"PACKAGING"
Dylan Jones
>I'm a tech illiterate retard with no basis to argue from
>YOU NEED TO PROVE THAT THE SKY IS BLUE HURRRRRR
That's cute.
Colton Lewis
yes, and who is right? tom's, who said 200 watts from the pcie, or tech, who said 10 watts more? which is more plausible, that amd built a fucking death star like tom's says, or tech?
Jonathan Mitchell
>no evidence
Don't make statements you can't back up kiddo
Gabriel Reyes
It said 200w peak; everyone is reporting 100w draws when overclocked, which is probably what it's drawing. The fact that it's trying to pull 200w from a 75w-rated slot is already worrying enough.
Leo Williams
>these two GPUs have totally different electrostatic characteristics per transistor >because I said so >despite being the same architecture >despite being on the same process >despite reaching the exact same clocks at the same voltages
I don't need to provide evidence for anything. You asserted that they are different, the burden of proof is on you to back up your unfounded childish nonsense. Tech illiterate retarded child.
Robert Gray
that's the problem, there is no possible way for someone to measure the fucking pcie directly. search all you want, currently there is no system for it, and yet we see pcper with a heavily modded pcie slot outside of the motherboard.. gee, i wonder who supplied them.... pcper, who literally got the 1080 power draw wrong, now has more state-of-the-art machines than tom's hardware.. lel
Isaac Gutierrez
>there is no possible way
What is a riser card?
Test Method Contact-free DC Measurement at PCIe Slot (Using a Riser Card)
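For reference, this is all a riser-card measurement boils down to: sense the current on each slot rail contact-free, multiply by the rail voltage (P = V * I), and compare against the PCIe CEM slot budget (5.5A on +12V is 66w, 3A on +3.3V is ~9.9w, ~75.9w total). A minimal sketch, plugging in the 95w/5w figures quoted elsewhere in this thread:

```python
# PCIe CEM spec current limits per slot rail for a x16 graphics slot.
RAIL_LIMITS_A = {"12V": 5.5, "3.3V": 3.0}
RAIL_VOLTS = {"12V": 12.0, "3.3V": 3.3}

def slot_power(currents_a):
    """Total slot draw in watts from measured per-rail currents (P = V * I)."""
    return sum(RAIL_VOLTS[rail] * amps for rail, amps in currents_a.items())

def over_spec(currents_a):
    """List the rails whose measured current exceeds the spec limit."""
    return [rail for rail, amps in currents_a.items() if amps > RAIL_LIMITS_A[rail]]

# ~95w on +12V and ~5w on +3.3V, i.e. ~7.9A and ~1.5A.
measured = {"12V": 95.0 / 12.0, "3.3V": 5.0 / 3.3}
print(f"{slot_power(measured):.0f}w total, over spec on: {over_spec(measured)}")
# prints "100w total, over spec on: ['12V']"
```

Note the 3.3V rail is fine in this example; the whole overdraw argument is about the +12V rail blowing past its 5.5A allowance.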
Ian Bailey
...
Grayson Ramirez
ideally yes, the problem is that it's not that simple. add-in cards like the one pcper showed are identical to the ones amd and nvidia use, and they have logic and memory on them; they are more expensive than any of them could afford to have... and yet they somehow got one..... lol
Dominic Bell
>When we zoom in we find that the motherboard is actually providing more than 95 watts of power over the +12V line and maintains the 5 watts from the +3.3V line, proving that we are indeed getting more than 100 watts through a PCIe connection that is only rated at 75 watts
AMDicks are burying this hard, enjoy pulling 200W through your mobo from a 75w rated slot
Logan Campbell
But it says right there it's pulling 100W
Asher Wood
>Pascal == good
Wew, what is this new meme. Pascal is shit. It's just Maxwell with a new name.
Elijah Lopez
Okay. So one source said 100w peak draw from the slot, not 200w. PCIe lanes are controlled by the motherboard regardless.
The 750 Ti regularly drew over 100 watts, peaking at 145 watts. I'm not seeing the issue, especially since that card would have been on older motherboards.
Or cheaply built and aged prebuilts and workstations.
Jacob Parker
Anything below 90C doesn't matter.
Camden Campbell
Yeah, that Samsung deal sure cost Nvidia. Which is why their cards overclock and AMD's can't.
Evan Mitchell
chinks > indians at making GPUs
Matthew Fisher
>Nvidia looked at kepler and cut as many corners as possible to make it good for gaYmeeen
Ayden Baker
Name one use for AMD aside from cheap and shitty gaymer cards
Just about any good program has CUDA acceleration
Kevin Rivera
AMD has issues with software that's implemented by INTEL and its supporters in order to make AMD look bad and ruin their competition, from what I was able to gather. Whenever I have an issue, I find it's because of code and not hardware.