So I've been on the red team for a while. All three of my AMD cards have had very slight problems and I think I'm finally tired of it. Now that the GTX 1080 is out, which one should I buy?
EVGA, Gigabyte, or MSI?

Other urls found in this thread:

en.m.wikipedia.org/wiki/List_of_games_with_DirectX_12_support
forums.geforce.com/default/board/172/geforce-1000-series/

Wait for the Rx 490, it's gonna rape the GTX 1080.

The Rx 480 already left Nvidia's asshole bleeding and brutally bruised.

Are you sure though? Top end AMD never beats top end Nvidia.

>wait
>believing in AMD after the shit they just recently pulled
Please stop pajeet.

Gosh, yes, that is a puzzling conundrum, fellow unbiased consumer. My sincere advice is that I'd recommend buying two GeForce GTX 1080s so that you can experience two different brands. With the low, low price of the GeForce GTX 1080, there's never been a better time to run an SLI setup!

I don't know why anyone would buy anything other than Gigabyte. Historically and factually always the best build, the best card, the best overclocking, and the sickest looking

>I don't know why anyone would buy anything other than gigabyte

Gotta agree breh. Never had a prob with any Gigabyte stuff

Pretty sure. AMD is going all out with the Rx 490.

Rape would be putting it lightly, given what AMD plans to do to Nvidia with the Rx 490.

Pic related is possible specs. Launch date is rumoured to be Q3 2016 now.

Do they just suck at making AMD cards then? Or have I been misinformed about that? I've always bought sapphire for red cards.

>Pretty sure
Your golden cow is dead pajeet, stop building another one.

>the 390 is faster than the rx480

>let's wait months and months until all the enthusiasts already have a 1080, then we'll release our mega expensive tier 1 card for sale!!!!! so many people will buy it!!!!!!!!!!!!!!

Their first 14nm offering doesn't even beat the majority of Nvidia's 28nm lineup from 2 years ago. Please ignore the pajeets.

EVGA is always the best bet with nvidia cards, unless you find a deal or are otherwise attached to some other manufacturer. EVGA has always had close (exclusive) ties to Nvidia, and EVGA tends to use the reference designs for their cards which are more solid and reliable.

see

>dude just wait until the heat death of the universe

>Their first 14nm offering doesn't even beat the majority of Nvidia's 28nm lineup from 2 years ago. Please ignore the pajeets.

Using DirectX 11. DirectX 12? AMD thrashes novidya

>See shitty pajeet infographics done by a 12 year old defending mommy's purchase

kys

Get EVGA.
Never. I'll enable Vulkan when it's available and switch to Linux.

I'm considering Vega for my 4K project, so AMD had better deliver a fast GPU that can manage 60 FPS Ultra 4K DX11 performance.

Yeah let me just play the 2 games that actually run dx12

My gigabyte gaming 5 z170 board has shit fan controllers

>DX 12
>less than 10 games announced for the next 2 years

Who the fuck gives a shit about DX12?
99.9% of games run DX 11 and under.

en.m.wikipedia.org/wiki/List_of_games_with_DirectX_12_support

hnmmmm

It's like you hate the future. Get with the times, you damn luddites.

>Kings of Wushu
>W.N.C infantry
>Microsoft first party shovelware
>Indie garbage
>Watchcucks 2 because the first one was so good

Wow, I can't wait to play all that garbage at 60FPS 1080p

>Hate the future
I'll buy the cards of the future when the future gets here, thanks. I like enjoying the games I have right now. You can waste your $200 on AMD dogshit and wait for DX 12 to become relevant in the next 4-5 years

Protip: the card will probably be worth $20 then at the budget bin.

Not a single game that is worth a shit

>buy hardware right now for the future

Top kek pajeet kys right now

laughed at this post

>480 is gonna be between 970 980
>paid nvidia shills overhype it to pascal titan levels
>card gets released exactly where amd said
>paid shills are crying because amd delivered
>paid shills found a problem with the card and went overboard
meanwhile
>paid shills dont even talk about their own problems
>1070 cant boot the system
>1080 has lower perf after 10 mins
>dual link dvi not letting people boot
>display ports randomly go on a rampage and black screen the system
>idle power usage is higher than advertised
>gsync at 144hz restarts the drivers
>drivers for 7 months now have been randomly killing cards (i guess it's a feature to move people to pascal)
forums.geforce.com/default/board/172/geforce-1000-series/

paid shills can't defend this shit much longer, eventually it's going to be another async fiasco on their asses

I hope you noticed that you are comparing last gen cards with the newest AMD lineup.

Let's see how the RX 480 compares to the GTX 1060

>talk about their own problems
What problems are those?
Seems like you have a lot to say about Nvidia's problems while ignoring the fact that every major reviewer and AMD themselves have acknowledged the PCIe power draw problem.
>Nvidia shilling for AMD not retarded AMDumbs hyping up their own damn card
>this entire wall of text by the retarded AMD shill defending a shitty failure of a product
BBBBBBBUT NVIDIA!!!!!!!!

No you stupid designated poo eater. I own a 290x and you're not shilling me into buying into the DX12 meme

Sounds to me like a biased, narrow-minded, closet homo excuse desu familamam. Get with the times grandpa and stop relying on DX11 which is already ancient. DX12 is here, the future is here.

Protip: The RX480 will be remembered as having ushered in the death of n-plebia

in dx12? given how bad the 1070 is doing, well, lol

>biased
>the future is here
I'll keep waiting for the future and using the cards of the present thanks.

I have a dumb question. Why are there no x90 Nvidia cards? As far as I know there's a 960, 970, and 980, but a 990 isn't a thing. Is there a particular reason they chose this naming scheme?

>grandpa

Is that the best you poo in loos can do now?

I'll stop relying on DX 11 when all the game developers stop relying on DX 11, kthx. I'm not going to buy a card just because some poo in loos are butthurt by a green company.

Get with the times, you bunch of hipsters.

I thought this was a 18+ board.

>every major reviewer
>toms
>a random french site
>pcper -> nvidia pr site
let's see
>toms: 80-85 watts on a 75 watt pcie slot, ~9% over
>random french site: 133 watts >lol
>pcper: 80.5 watts against a 12v rail they call 66 watts -> 20% increase -> lel -> they don't even try to be a little unbiased -> total pcie power 66 watts!

meanwhile the card is still performing as intended as long as you don't fucking mess with the voltage while on stock clocks. literally, i wonder who was the idiot that did this in the first place

>i own a 290x
literally the signature of a paid shill, every single one of them has a 290x

Every major reviewer with the EQUIPMENT AND KNOWHOW to test this out has done so and found the same result, fuckface.

Yeah, you're lost. You need to go back to >>>tumblr

the first full dx11 game came 2 years after dx11 was released
the first full dx12 game came 8 months after dx12 was released...
every major AAA game of 2016 is on dx12
keep waiting for nvidia to find a way to release more dx11 games
>lel

>You lost
>Says the child talking about non-existent DX 12 games

Summer is here, I feel it in my bones

>every major AAA game of 2016 is DX 12
Why do shills like to lie?

hey tard, did you even read?
both of them found the card was drawing 80-84 watts
the problem is pcper being an nvidia shill site, as always: instead of showing the entire 75 watts of the pcie slot (2 rails), they showed only the 66 watt 12v rail in the table, and that alone made the wattage go into the stratosphere in the eyes of simple apes
literally throwing shit just because >muh nvidia money

>It's only 20% over the specified rating!

This is what the shitty AMDerps have become.

x90 cards are normally dual-GPU cards.

>wait for poo in loo

prove me wrong
yeah, amd is the problem, not a site that knowingly falsified the results
lel
paid drones all over Sup Forums

Yes, and the PCI-E specification is rated for 66w with a 9% leeway, that's 75 watts. And even then they managed to breach those standards AT STOCK SPEEDS.
LITERALLY RISING TO 50+% OVER THE RATING WITH OVERCLOCK.
Jesus fuck even AMD has acknowledged the issue. Will you stupid shills just accept that AMD fucked up and move on?

>Falsified the results
Yes major independent review sites are falsifying test results because they don't favor AMD.

Please take your retardation back to reddit

Reading comprehension is not one of your strengths, is it grandpa?

please give me a link that shows pcie having a 66 watt spec

please do so
oh wait, you can't? what do you mean you can't? but you said it's a "MAJOR INDEPENDENT SITE"
how can they lie!

wow its real

Top kek

Hey fucktard. There are these things called 12 v rails, right? The PCI-E slot's 12 v rail is rated for 5.5A, so you take the 12 v and multiply it by 5.5A to get 66w, like how math works, right?

After that you take the result shove it in your anus and kys.
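The arithmetic both sides keep invoking fits in a couple of lines. A minimal Python sketch, assuming the 12 V × 5.5 A figures quoted in the thread rather than anything pulled from the spec document itself:

```python
# Power = voltage * current (P = V * I).
# 5.5 A is the PCIe slot 12 V rating as quoted in the thread (assumed here).
RAIL_12V_VOLTS = 12.0
RAIL_12V_AMPS = 5.5

rail_12v_watts = RAIL_12V_VOLTS * RAIL_12V_AMPS
print(rail_12v_watts)  # 66.0
```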

looks like someone needs a math lesson

you fucking idiot, all pcie slots are dual rail
one 12v rail and one 3.3v rail, those combined give 75 watts, k? k
now when a paid shill site says the card had a maximum draw of 80 watts on the pcie slot and then goes and shows only the 12v rail, that is false on a universal scale, because you artificially shrink the baseline by 12%, which IS NOT TRUE
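The dual-rail claim can be checked the same way. A sketch assuming the commonly quoted per-rail ratings (12 V @ 5.5 A and 3.3 V @ 3 A; the 3 A figure is an assumption, the thread only says the 3.3 V rail adds roughly 9 W):

```python
# Total PCIe slot budget = sum of both rails' V * I ratings.
# The 3.3 V @ 3 A rating is an assumption, not stated in the thread.
rails = {
    "+12V": 12.0 * 5.5,   # 66.0 W
    "+3.3V": 3.3 * 3.0,   # ~9.9 W
}
total_slot_watts = sum(rails.values())
print(round(total_slot_watts, 1))  # 75.9, i.e. the ~75 W slot budget
```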

take 12 v and multiply it by the rated 5.5A for the PCI-E slot. I know grade school math is hard.

NVIDIOTS BLOWN THE FUCK OOOOOOOUUUUUUTTTTT!!

Yeah, you see the problem is that the 480 is taking 6-9A on the 12 v rail fuckface.

>meanwhile at AMD

>My motherboard is destroyed after playing games, thanks AMD.

Wow NVIDIA is shit.

>because nvidia cards also have problems it doesn't matter if mine does too!

you're a fucking cuck. report the goddamn problems like these guys do instead of slurping Poojeet's poo. it's a fucking tech corp that doesn't give a shit about you, not your mom

>math lesson
The math is 12 x 5.5A = 66w fuckface

no sir
the problem is that in their whole vomit of an article they claimed the card was drawing 80.5 watts at the pcie slot on stock clocks

now we know the pcie slot supplies 75 watts

later on, on the second page, almost at the bottom, you see a table
instead of showing: pcie slot 75 watts -> 480 80.5 watts
the table says:
12v 66 watts -> 480 80.5 watts, which is not true, simple as that
the pcie slot has been dual rail since its inception as a standard, and pcper just literally changed it to suit their green sponsor, nothing less nothing more
the 12v rail accounts for roughly 88% of the pcie slot power, the 3.3v rail gives the rest

haha oh dear

>nvidia
>best company
>never fails

...

>claimed that it was drawing

They used this thing called 'measuring equipment' like 'riser cards'.

Please stop talking about what you don't understand.

The thing you can measure is the CURRENT running through the board. You take that and multiply it by the voltage to get the watts.
The problem is called OVERCURRENT. The watts are simply a measurement of the power; the problem is the current running through the motherboard causing fucking heat issues. PLEASE GO LEARN SOME FUCKING GRADE 10 PHYSICS REEEEEEEEE
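The overcurrent point can be made concrete. A sketch using the ~7 A measurement quoted later in the thread and the 5.5 A rating quoted throughout (both numbers are the thread's claims, not independent measurements):

```python
# What gets measured is current; watts are derived from it (P = V * I).
rated_amps = 5.5      # quoted PCIe slot 12 V rail rating
measured_amps = 7.0   # draw reported later in the thread
volts = 12.0

derived_watts = measured_amps * volts     # 84.0 W on paper
excess_amps = measured_amps - rated_amps  # 1.5 A over the rating
# Resistive heating in the slot contacts scales with current squared
# (P_loss = I**2 * R), which is why excess current is the real worry.
print(derived_watts, excess_amps)  # 84.0 1.5
```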

>now we know that the pci-e draws 75 watts
No, the rating is 12v @ 5.5A with tolerances of about 8-9%, which brings up that 75W number

the 480 is drawing 6-9A at the 12 volt rail. It will literally melt your motherboard with prolonged use.

I wonder who could be behind all this FUD

no?
NO?
are you actually denying that the pcie slot has been dual rail since fucking pcie 1.0?
ARE YOU ACTUALLY SAYING THIS?

>dual rail
The PCI-E slot is not the same as the 6 pin/8pin connector you stupid prick.

Nvidia the N is for Quality

...

niggerVidya

...

>GTX1080 won't have problems
Have fun pissing your money away OP.

god damn, do you actually know wtf you are talking about?

the fucking PEG connectors can draw up to 480 watts

talk about throwing the discussion into orbit

>can draw up to 480 watts
The mobo can't take that many amps going through the fucking slot, you fucktard. It's going to create more heat than the slot can get rid of and destroy the slot and the surrounding hardware. It has already happened. AMD have already acknowledged the issue and are trying to fix it, but you and your fucking retarded brainwashed red-or-dead mind can't accept the fact that AMD fucked up the PCI-E slot power draw

DX12 is where the AMD cards shine, and everyone says "but there's only 3 DX12 games", which is a poor excuse, because after Battlefield 1 drops it's going to have a major effect on the industry.

EVGA has the best customer service, MSI has the best AIB card. Their fans are completely silent. I bought the MSI personally.

man
ok listen, i know you must be like a yeti trying to understand what a wire is, but

when someone is talking about the pcie slot
and someone else brings the PEG connectors into the conversation, that is just dumb
no, it's beyond an insult, we must actually invent a new one
(i vote we call it by your name)

tl;dr
the pcie slot gives 75 watts
pcper says the card gets 80 from the slot
pcper later shows that the card got 80 on the 12v rail, completely dropping the second 3.3v rail of the pcie slot
he literally said pcie is 66 watts
end of story
no fucking PEGs
no fucking maths, no nothing, he deliberately changed the number to suit his needs

oh no, you are waiting for the wrong game here, sir...

just have a look at the new deus ex... completely rendered on the async compute path. paxwell users will just call it "broken", just like they did with quantum break, because it couldn't handle async

NO

>PCI-E slot is 66 watts
No, he said the card is drawing 80w from the slot, please stop lying.

OP here.
If waiting for vega is the best option, I'll do it. Both seem to have their problems, but I still want to make the best purchase for my money.

Here's an example of Nvidia at work. Every DX12 title runs really well on AMD hardware. Rise of the Tomb Raider is an Nvidia exclusive. Guess how it runs, even with DX12. Fucking Nvidia, man.

I wonder what the results would be if Pube Hair was disabled?

The highest power draw I measured with the RX 480 at stock settings showed 80-85 watts of power draw at over 7A on the +12V line and 4.5-5.0 watts of power draw on the 3.3V line. These were consistent power draw numbers, not intermittent spikes, and users have a right to know how it works. When overclocked, we witnessed motherboard PCIe slot +12V power draw at 95+ watts!

PCIE SLOT MOTHERFUCKER, SHUT THE FUCK UP ALREADY. NOBODY IS TALKING ABOUT POWER FROM THE PINS, JESUS FUCK, YOU'RE INSUFFERABLY RETARDED.
THE GRIPE IS THAT THEY'RE DRAWING OVER 75 W FROM THE SLOT, NOT THE PINS. THEY'RE VIOLATING THE PCI-E STANDARD AT THE SLOT

obviously you don't know shit about hardware, do you? i suggest you start from this:
he literally said pcie spec = 12v +-8% = 66 watts
while the rest of the world, with even a simple understanding of hardware, knows the pcie slot has been 75 watts on a dual rail configuration since before christ was born

wat is dual rail?

12 v at 5.5 A is 66 watts; add the tolerances and you end up with about 75W.
We are talking about the slot, not the pins.

the slot not the pins
the slot NOT THE PINS
THE SLOT NOT THE PINS
meanwhile on the pcper site:
pcie spec: 12v +-8%
66 watts!

the RX 480 was just an Nvidia ruse cruise
Nvidia shills overhyped it as having GTX 980 performance so that Nvidia could run the whole "1060 will have GTX 980 performance" campaign, in hopes of pulling in people who fell for the hype

It literally says slot there.
>66 watts
Yes 12 multiplied by 5.5 is 66 watts.

Wouldn't Novidia just need a driver update to address the DX12 performance? Pretty sure it isn't hardware related.

There we go, shitty PSU, 11.45V on the 12V rail, that's 0.8A more to make up for the lacking Voltage.

dual rail is:
the pcie slot gets, from the 24 pin connector of the motherboard,
1) a 12v rail, 66 watts
and
2) a 3.3v rail, 9 watts
the whole thing in this thread is that they don't want to admit pcper is deliberately misleading everyone by saying pcie spec -> 66 watts, which it has not been since the first day they created pcie.
now, normally the card draws 80 watts instead of 75, and that is about 7% over, but if you measure against pcper's 66 you get an increase of over 20%, which is stupid by any standard
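The two percentages being argued over come straight out of the same measurement against two different baselines. A sketch using the 80 W figure quoted in the thread:

```python
# One measurement, two baselines: the full ~75 W slot budget vs.
# the 66 W 12 V rail alone (the number pcper is accused of using).
measured_watts = 80.0

over_slot_budget = (measured_watts / 75.0 - 1) * 100  # ~6.7% over 75 W
over_12v_only = (measured_watts / 66.0 - 1) * 100     # ~21.2% over 66 W
print(round(over_slot_budget, 1), round(over_12v_only, 1))  # 6.7 21.2
```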