Hurr durr 480 is as good as an 1080 for only 500$ nVidia is over

>hurr durr 480 is as good as an 1080 for only 500$ nVidia is over
Is this really the level of delusion it takes to be an AMDrone?
It's an undeniable fact that it will draw almost twice the power of a 1080. This means in only 2 years, the 1080 will work out cheaper at average electricity prices.
Secondly, CF/SLI sucks compared to a single GPU. It just does. Unsupported games, bad frametimes, bad airflow, more noise, bad G-Sync/FreeSync... If CF/SLI were a viable replacement, nobody would buy high-end cards to begin with, whether they're from AMD or nVidia
Thirdly, this is even what AMD are saying themselves! They've officially stated that Polaris is going to compete in the low-to-mid end. We can all be glad that AMD has released a competitive card for the mid-end market. This is what most people buy anyway, and it will put pressure on nVidia to lower their high-end prices. But that's about it.


the way it's meant to be shilled™

Point out anything that's not fact

There is no pressure, GP106 will be $199 and AMD simply can't compete with Nvidia in performance or power efficiency when they chose a bad Samshit 14nm LPP process

K E K
E
K

First of all, 2 RX 480s cost only 400 dollars and consume the same amount of power. Yes, CF sucks, but it's also 400 dollars cheaper than a GTX 1080 and you get the same performance

Meant to say 300 cheaper

>consume the same amount of power
>400 dollars cheaper than a gtx 1080
desperately lying on an anonymous anime forum because your favourite video game brand is losing

still desperately lying (or just completely ignorant)

CF/SLI might suck, but that's why the 480 is probably going to be properly configured for DX12 multi-gpu support instead.

1 gtx 1080 costs 699 and consumes 240 watts
2 rx 480 cost 399 and consume 250 watts
CF does suck and those points are valid, but the performance is almost up there with a 1080

OP is in so much pain!!!!!

That doesn't solve the majority of problems. Also, even IF you could just "hurr just DX12 it lmao", then nVidia would have done it too. DX12 isn't magic that's going to give you compatibility and 99% efficiency
>gtx 1080 costs 699 and consumes 240 watts
wrong
>rx 480 cost 399 and consume 250 watts
also wrong; you don't even know what you're shilling for, drone. That would mean it consumes 500 watts in CF.
>the performance is almost up there
no, the frame rate is.
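
The distinction matters: two cards can match a single card's average frame rate while delivering much worse frame pacing. A minimal sketch of the idea — the frametime traces below are made-up illustrative numbers, not measurements of any real card:

```python
# Hypothetical frametime traces (milliseconds) illustrating why equal
# average fps can hide worse pacing: the single-GPU trace is steady,
# the dual-GPU trace alternates fast/slow frames (micro-stutter).
single_gpu = [16.7] * 100
dual_gpu = [8.0, 25.4] * 50  # same mean (~16.7 ms), uneven pacing

def avg_fps(frametimes_ms):
    # Average fps = total frames / total seconds rendered.
    return len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)

def p99_frametime(frametimes_ms):
    # 99th-percentile frametime: what the worst 1% of frames cost.
    s = sorted(frametimes_ms)
    return s[int(0.99 * len(s)) - 1]

for name, trace in [("single", single_gpu), ("dual", dual_gpu)]:
    print(name, round(avg_fps(trace)), "fps, p99 =",
          round(p99_frametime(trace), 1), "ms")
```

Both traces average about 60 fps, but the dual trace spends its worst frames above 25 ms — the micro-stutter people complain about in CF/SLI.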

>This means in only 2 years, the 1080 will be cheaper at average, electricity prices.

Just how expensive is electricity where you live? Assuming 150 W per card, over 2 years I'd need to have them working at full blast 24/7, all of it at the highest-demand pricing. And electricity in southern Europe is already expensive as fuck.

I used the American average of 12¢ per kWh. I was also more generous towards AMD than you. With your "full blast 24/7" at 10¢/kWh it's about $130/year more expensive to own the AMD setup

Could you do me a favor and re-read my post before commenting?

>2 rx480 cost only 400
>when nvidia announces MSRPs they're bullshitting, but when AMD does it's fact
lol drone

Ok, I thought you were listing things, since you usually don't write small numbers as a digit like a preschooler.
Anyway, your watts are still wrong for both cards. You're desperately lying. Third-party 1080s are also less than $699, and we don't even know what the actual prices for the 480 will be.

Sigh... I can't believe I thought I was having a conversation with a normal person, but alas, he was a shitposter.
I guess sage and goodbye

>facts are shitposting

CRAPSMANSHIP

This is literally all AMD cucks have to offer

>all these butthurt nvidia faggots
1
9
9

Who the fuck said it's as good as the 1080?
Who the fuck said it's 500 dollars?
It's 200 dollars, and it's meant to be a good price/performance mid-end card.
Why are you so retarded OP?

...

>beats a 980 at half the price

B A S E D

Ok, redid the math with a 50/50 split of our €0.19 and €0.10 per kWh for high- and low-demand hours; turns out I botched the math and got roughly €175/year. I stand corrected.
Still, it's a worst-case scenario. Hell, I have my PC on 6 hours a day, and that's being generous. Assuming I turn it on every day, I'd need at least 5 years.

Sup Forums said both of those. It's for CF
read again.

It's $200. Are you even listening when people tell you the price? And the 1080 can only be had for $700 now, not at some future number that may or may not happen; we'll see if retailers jack up the price.

They weren't lying about the $700; in fact, we've had people post receipts for $750 here.

>$200
>not some future number that may or may not happen
TOP KEK, defeating your own argument there
>1080 can only be had for $700 now
I've seen a $630 EVGA ACX on Amazon

>it will draw almost twice the power of a 1080. This means in only 2 years, the 1080 will work out cheaper at average electricity prices.

>480 has same power draw as 1080
>thinking an electricity bill will outpace a high end card's depreciation
>ever

NVIDIOTS, everyone. Also, nice strawman greentexting in the OP.

see

if it only takes 2 mid-range Polaris GPUs to beat a 1080, then nvidia is in trouble once Vega rolls around

>>hurr durr 480 is as good as an 1080 for only 500$ nVidia is over
>Is this really the level of delusion it takes to be an AMDrone?
>It's an undeniable fact that it will draw almost twice the power of a 1080.

Never says anything about CF until later in the post, and even then it's just general commentary about CF/SLI. Shut the fuck up if you can't even explain yourself clearly

Pretty much.
They're just happy that a card they can finally afford can match a GeForce in a synthetic bench when CFed.

>Never says anything about CF until later in the post
So you're one of those guys who finds it hard to read more than 3 sentences?

i don't get the appeal of the GTX 1070 and GTX 1080. 95+% of gamers aim for 1080p and 60fps. the 1070 and 1080 are indefensibly overkill for that.

2 mid-range nvidia cards also get higher framerates than their top models, so you're not really making a point.
Also, how do you go from "roughly equal framerate, but with all the drawbacks of a dual-GPU setup" to "beat a 1080"?

95% of gamers won't buy them either. Or they'll buy one and keep it for 5 years

And what about the appeal of a card that is an equivalent of a mid range card from 2 years ago?

>it's as good as a Fury for $200
Yeah, right. AMD will gimp their $500 cards because of you people. Stop being borderline delusional.

>overkill

at least for now
It's worth it if my 1070 lives for 4 years and lets me play games on high or max settings most of the time.

My last expensive card was a 9800 GX2, at the same price level anyway; it lasted me 5 years before giving out.

>Overclock that shit and you'll have Fury X/980 Ti-beating performance for $200

Based AMD, they could have followed nvidia and put this nearer to $300

>And what about the appeal of a card that is an equivalent of a mid range card from 2 years ago?
because it can play every game at 1080p 60fps? for $200? a $200 card that can play every game at max at the resolution and framerate that quite literally every game on PC aims for?

every game released is built around the ps4's and xbox's weak 6-year-old gpu tech. blowing $600 on a gpu is just dumb, you're wasting money on wasted resources.

>1070 and 1080 are indefensibly overkill for that.
For now.

If you're buying a video card for video games in 2016, you're wasting your money anyway.
Video gaming is fucking dead. Wait 2-3 years for VR to pick up.

Holy fucking shit, most of the people in this thread are beyond saving.

Also, where do half of you reside? This thread started too early to have burgers in it... Oh wait, it is summer after all.

Fuck off to Sup Forums with this gay shit. Seriously, read half the posts in here and tell me this doesn't feel like a school playground, with a bunch of kids arguing about useless stuff and trying to sound the least bit intelligent.

This is all coming from a tech illiterate NEET gaymur who at least respects the integrity of this board enough to just lurk.

>Videogaming is fucking dead
You've spent too much time on Sup Forums

Are HBM2 cards due next year or in 2018?

I might pick up an RX 480 as a cheap fix for my shitty GT 630 and wait for the HBM2 cards to release.

>VR

nice joke, it's never gonna take off

>VR

But VR is just the Wiimote/Kinect/PSMove for PC. It's little tech demos and minigames and you can't make elaborate games for it.

Then that's even less of a reason to buy a video card, ain't it?

Most likely next year unless they get delayed again

The 480 looks like the perfect card if you're still on 1080p. I'm picking one up for my current i5 2500K/7970/1080p 120Hz build.

In 2017/2018 I'll probably do a new build with Zen, an HBM2 AMD card, and a 1440p FreeSync 144Hz IPS monitor

Will the 480 or 1060 be better for 1080p csgo?

I have a 144Hz monitor, and right now I get ~250 fps with a 7870 and a 6300. I would like to get 325+ fps.

I'll be getting an i5.

Obviously, you just wait for the benchmarks and see for yourself.

This is pretty much the "logic" of buying a 1080 card in 2016.
Considering most game engines are optimized for 1080p and the last gen of video cards, it's fucking ridiculous to even imply it makes sense to buy a 1080 card now.

>game engines optimized for 1080
>droolinghomer.tiff

It's gonna be damn near impossible for Nvidia to beat the 480 at this level of performance/price. Then again, the 960 wasn't the best in terms of price/performance in its segment either.

The 60 variant has not exactly been a strong point for Nvidia since the 460/560.

Except for, y'know, future games instead of 4 year old games

>being this retarded
>amd and nvidia cards both draw more power than rated

>amd overdraws maybe 10 percent more than nvidia

>i don't research shit and draw conclusions based on that one website i like that proves my point versus those that go against me. i'm kinda like an SJW

Thanks


Hopefully I can squeeze it all into a pink SG13 :^)
Speaking of which, is there any reason to buy expensive Noctua fans nowadays? It seems that cheap fans are just as good as expensive ones

But they are.
Most of the technology they use for LODs, tessellation, texture filtering, pretty much everything they've been doing to squeeze a bit of fps out, has been aimed at 1080p and below, at the old generation of cards, and specifically with the limitations of DX11 in mind.

Buying a 480 for 1080p gaming thinking you're future-proofing yourself, or for the extra 20 fps you'll get in a current title, is a ridiculous thing to do.

If you have a card that already does 1080p around the 60 fps mark, keep it and don't be a mouth breather.

By future games you mean console ports that are designed to run on 5 year old hardware

So what would you recommend to a 144hz gamer with a 7870 then?

The 1080?

You can't even spell or form a coherent sentence.
And none of the techniques you describe are resolution-dependent

You left out the "terribly unoptimized" part of console ports

A better card won't help you much with that.
I've seen PS3 ports lagging on a Titan.

>60fps

lol it isn't 2010

>That doesn't solve the majority of problems. Also, even IF you could just "hurr just DX12 it lmao", then nVidia would have done it too. DX12 isn't magic that's going to give you compatibility and 99% efficiency

Can you explain to me what the "majority of problems" are? The multi-GPU support in DX12 requires no agency from nvidia/AMD for CF and SLI; you can now run a lesser card paired with a high-end one without the high-end card being bogged down by it, for example. The 1080 will still probably hold a lead in DX11, but DX12 gives AMD the edge. We've already seen the effect it has in AotS.

Nvidia hasn't made good design choices as of late because they're out of touch. Like, why do they keep shipping GPUs with tiny buses?

2 480s would consume ~300W afaik

So, if I want to end up running a 144hz 1440p monitor, I should probably go with a 1080 right?

>Can you explain to me what the "majority of problems" are?
None of the drawbacks mentioned in OP's post are solved by DX12
>The multi-GPU support that DX12 has requires no agency from nvidia/AMD
Multi-GPU still requires per-game implementation, which is something nvidia/AMD usually help devs with, so yes, it does involve them. Also, it doesn't magically make a single-GPU engine able to run on several; only maybe 30% of a game's work runs through DX anyway

>why do they keep shipping GPUs with tiny buses?
Why are you a specfaggot? The important part is how it performs in video games and how much it costs.

Poojeet, pls.

>Fuck off to Sup Forums with this gay shit
>gay shit
Are you literally 12?

>None of the drawbacks mentioned in OP's post are solved by DX12
The only one that applies here is the energy cost.
>Multi-GPU still requires per-game implementation, which is something nvidia/AMD usually help devs with, so yes, it does involve them. Also, it doesn't magically make a single-GPU engine able to run on several; only maybe 30% of a game's work runs through DX anyway
Everything is on the dev's implementation anyway. Why do you think CrossFire/SLI isn't achievable on DX12?
>only maybe 30% of a game's work runs through DX
Did you really just say games only make 30% use of an API?
>Why are you obsessing over specs? The important part is how it performs in video games and how much it costs.
There's a reason AMD has outperformed Nvidia at high resolutions for generations.

>The only one that applies here is the energy cost.
Read again. Literally every one of them applies. Or do you think DX12 will improve the physical airflow in your computer?
>Everything is on the dev's implementation anyway.
Like it is now
>Did you really just say...
No, and I don't even know how you misread a simple sentence that badly
>There's a reason...
Like 5% of gamers play at those resolutions. And the two have been pretty much neck and neck anyway.

I bet big engines like UE4, Unity, Source, and CryEngine will all get multi-GPU support, even if just the basics.

As far as I'm aware, multi-GPU support should be fairly trivial for most devs, since it's built into the API rather than being an extension, which was the case with SLI and CrossFire and why it was rarely implemented.

Most engines are also multi-threaded in this day and age, so it's just a matter of working out which render threads can be offloaded onto another GPU while the other is running its own task. Hell, you could also ignore rendering entirely and just use the compute shaders to handle physics and particle effects.

The only difficult bit will be ensuring both cards are correctly synced, as I expect putting in a very low-spec GPU could cause slowdown if even a single task is too much for it in the first place. In DX12 I believe you can query feature support from the GPU, so you could work out what it might best be used for, or whether to ignore it altogether.

In my mind, not using multi-GPU would be foolish for any dev, the equivalent of not running multiple threads on a multi-core CPU; since the support is built into the API now, there's no excuse not to use it if you know how.
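
To make the sync concern above concrete, here's a toy model of the scheduling idea — plain Python, not actual D3D12 code, and the task costs and GPU "speeds" are invented numbers for illustration:

```python
# Toy model of explicit multi-GPU work splitting. Each frame is a list
# of task costs (ms on a speed-1.0 GPU); tasks go greedily to whichever
# queue is least busy, and the frame can only present once every queue
# has drained -- so the frame time is the max of the finish times.

def frame_time(tasks_ms, speeds):
    finish = [0.0] * len(speeds)          # running busy-time per GPU
    for cost in sorted(tasks_ms, reverse=True):
        i = finish.index(min(finish))     # least-loaded queue
        finish[i] += cost / speeds[i]     # slower GPU -> task takes longer
    return max(finish)

tasks = [4.0, 4.0, 3.0, 3.0, 2.0]         # 16 ms of work in total

print(frame_time(tasks, [1.0]))           # one GPU: 16.0 ms
print(frame_time(tasks, [1.0, 1.0]))      # two equal GPUs: 9.0 ms
print(frame_time(tasks, [1.0, 0.25]))     # weak second GPU: 16.0 ms again
```

Two equal GPUs nearly halve the frame time, but pairing with a GPU a quarter as fast erases the gain entirely because the frame waits on its one big task — exactly the slowdown case described above, and why querying what each adapter can handle before assigning work matters.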

1. All of these are done to varying degrees and take time and resources to do well. There's no magic DX12 button that makes everything work perfectly.
2. You're still focused solely on DX.
3. Even if UE4 ships with multi-GPU support, many devs will implement their own systems on top of it that can only run on a single GPU

>even then it's just general commentary about CF/SLI
So you're one of those guys who can't even read a complete sentence?

>gtx 1080
It's 50 watts less than the 480

Actually 100w more than a single 480

>I bet big engines like UE4, Unity, Source, and CryEngine will all get multi-GPU support, even if just the basics.

Crossfire has been pretty well implemented by developers since 2012.

I remember getting a second 5770 to put into crossfire, and holy shit that shit worked amazingly well.

>hi i'm a poorfag so i have to sacrifice performance for money
>hi yes amd for life!
amdfans

Just wait and see
What's the point of this speculation?
I've heard so many conflicting rumors in the last 48 hours

>mid-end
Middle range cards can't be at the end.
The end is either high or low.

Oh wait, I forgot, it's a budget card, so mid-end performance all the way.

Cuck. I have a gtx 960 btw

>everyone says crossfire sucks
>meanwhile my dual 390x runs great with no problems in any games

get good

At least they have a monopoly on the Housefire market.

>owns a 7870
>can confirm hot as fuck at default clocks
>sometimes smells like burning plastic

Be honest OP, you live in your mom's basement and don't even pay the electric bill. You don't care about that shit.

RX 480s have a single 6-pin each anyway. Power draw is fucking nothing. I don't have much brand loyalty, if any, but holy crap you nvidiots sure are butt blasted about this shit.

I don't pay the electricity bill

Nvidiot kids are at this level of damage control because their moms will get them the 480 instead of the 1080.

How can you be this much in denial?

see

Hey, don't kill the messenger, I'm only stating facts here.

No, you're not. You're either an idiot or deep in denial.
lmgtfy.com/?q=1080 power draw

Thanks for confirming that the 480 draws 100w less than the 1080.

>1080p
Ahahahaha poor fucking pajeets

The average price for a kilowatt hour of electricity in the united states is twelve cents.

The 1080 and 480 are both 150 watt cards.

If you use two 480s, that's 150 watts more than a single 1080. At 90 minutes of use per day, that works out to 6.75 extra kilowatt-hours per month, which totals a whopping $9.72 per year.

If you have any sort of balance in your life, you're not gaming for more than 10 hours per week. Even if you treat gaming like a full time job and do it 40 hours a week, the difference is still less than $40 per year.

There is absolutely no argument to be made here on power consumption.
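
A quick sketch of the same calculation, using the post's own assumptions (150 W of extra draw, the ~12 ¢/kWh US average). Note the post used 30-day months for its $9.72 figure; a 365-day year lands a few cents higher:

```python
# Yearly cost of extra power draw: watts -> kWh -> dollars.
def yearly_cost_usd(extra_watts, hours_per_day, cents_per_kwh=12.0):
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * cents_per_kwh / 100.0

print(yearly_cost_usd(150, 1.5))      # 90 min/day: about $9.9/year
print(yearly_cost_usd(150, 40 / 7))   # 40 h/week: about $37.5/year
```

Even the treat-gaming-like-a-full-time-job case stays under $40/year, matching the conclusion above.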

You have single handedly lowered my view of AMDrones

It still draws extra power when you're not gaming. And not just when idling, either; even shit like Facebook has GPU-accelerated parts

>ass blasted /thread