GTX 1070 faster than a 980ti and Titan X

AMD is finished

Other urls found in this thread:

pcgamer.com/total-war-warhammer-benchmarks-strike-fear-into-cpus/
wiki.totalwar.com/w/Optimisation_Blog
kitguru.net/components/graphic-cards/anton-shilov/sales-of-desktop-graphics-cards-hit-10-year-low-in-q2-2015/

Alright Sup Forums. Let's try something. Let's just not fucking respond. I believe you can do this.

>AMD's cards haven't even been announced yet
>the 1070 only has leaked benchmarks, we just don't know the full picture yet
>fanboys are still going to argue like they're both totally sure of their information

Why is it always like this on male-dominated sites? You never see women at each other's throats because they disagree over someone else's Linux distribution or video card preference.

That's cuz bitches can't Linux

R9 380 4gb SOC bro here


For the price I expect it to score very high, family.
Good, so when I upgrade to the new 1000 series memecard I can look forward to good things :)

Wait for Polaris to have an objective performance result, though.
Brand loyalty is for plebs

If this holds up, a $379 MSRP is a steal for this kind of performance, brand loyalty or not

If it's anything like the 970 launch, it will be sold out for months, and if you do buy one, expect to pay a premium just because people can charge that much.

>If this holds up, a $379 MSRP is a steal for this kind of performance

Yep ;)

Can we?

>mfw they said the same thing about the 1080

>AMD's cards haven't even been announced yet
Is that an excuse? Lol

>using a biased game that heavily favors AMD

Are you retarded?

Whatever you Gefucboi Nvidiots do, at least wait for third-party overclocks with aftermarket heatsinks.
It's already been shown that most of the good third-party 980 Ti cards perform better than the reference 1080.

Not him but
>Bias
>980 Ti and Fury X are basically within a hair's breadth of each other
What the fuck are you smoking?

It's widely reported that AMD worked closely with the devs of the game.

And?

AOTS is an AMD-sponsored game, but the devs worked more closely with Nvidia. What's your point?

Are you intentionally ignoring ?

>dat 4k score within margin of error of FuryX
lop tel

You mean any true DX12 game?

>HBM IS JUST A MEME
>V-VEGA WON'T SHIT ALL OVER PASCAL
Enjoy having the strongest mainline card on the market... for the next four months anyways.

>AMD's cards haven't even been announced yet
They are late to the party as always, and it will cost them

I'm just sitting here with my cock in my hand, refreshing Amazon, trying to grab an EVGA 1080.

That Total War DX12 benchmark was done with a demo benchmark supplied by AMD, not with the released game. The game only has DX11, and the 1080 wins there (pic related).

Read the article before you all fall for this bait...
pcgamer.com/total-war-warhammer-benchmarks-strike-fear-into-cpus/

>A-AMD will w-win n-next time g-guys!
It's almost like I've heard this before

>AMDrones BTFO with lies and misinformation once again

Classic

>buying on the first day, hell even the first month
You like being robbed?

Looking to get a Superclocked version of it; how much is that gonna decrease in price over the next few months? Not much, if any.

Here's a crazy concept: why don't you wait until Team Red releases their shit before claiming they're finished?

Here's another thought: why don't you just do research on all your options and make your purchase decision based on whatever option best fits your needs?

Fucking crazy right?

Then we can delete all these autistic "X on suicide watch" threads and have adult conversations.

Everything that's leaked so far suggests Polaris 10 will be mainstream.

3DMark benchmarks of Polaris 10 have already leaked, so we already have a pretty good idea of how it will perform.

In between the Fury and the 980.

So there's plenty of concrete stuff to base some estimations on already

Nigger it's in five and a half hours.

AMD will manage to respond with something kinda competitive with Vega, but when you see the roadmap for Nvidia, with the 1080 Ti and what follows... it isn't looking good for AMD in the medium term.

Again, this is for the PC master race. Company-wise, AMD will remain healthy thanks to console sales, which is also why the company doesn't care about the high end. At this rate AMD will end up making graphics units for cellphones.

>>A-AMD will w-win n-next time g-guys!
Fury X over the 1080 looks like AMD is competitive, at the very least when Nvidia doesn't gimp the game.

>Want to get a 1080/1070
>P8Z68-V PRO Gen 1 w/ i7-2600K
I should just buy a new fucking PC.

It's always in the future with amd cocksuckers. Never here, never now, always in a vague future with a lot of wishful thinking.

This shit is fucking hilarious. I've been hearing this same spiel from them since the Bulldozer days.

I'm gonna be up all night anyways, working weekend nights has got me fully nocturnal

The only thing we don't have an idea about is price; the range could take it anywhere from $200-300, or ~$400 if they price it like Nvidia would, because it has to be priced higher than a Fury if it's faster than one.

The price makes or breaks the card. Granted, we also don't know what the 1070 will be priced at either, and not the MSRP, but what the card will actually sell for.

>wanting to upgrade from a 2600k

You should buy a new brain instead.

>looks like AMD is competitive, at the very least when Nvidia doesn't gimp the game.
Looks like you didn't read very well
see

Bulk sales are not high-end cards; chasing that is just bragging rights when it's a sub-5% market.

Yeah, I skimmed the article; they are putting out a full DX12 patch in the near future and are showing some of the benchmark results.

>This shit is fucking hilarious
It is

So what does it matter when you can just play it on DX11?

It seems like most games coming out that have DX12 also have a DX11 problem.

Seems like the best of both worlds to me.

Nvidia card owners can use the DX11 mode and enjoy optimized drivers with great performance.

AMD card owners can use the DX12 mode and benefit from the fact that they finally have good drivers, because the optimization is done on the game developers' side.

Both parties win.
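
To make the "both modes" point concrete, here is a minimal sketch (not from this thread) of how a game or launcher could probe whether D3D12 is even available before offering a DX12 renderer, falling back to DX11 otherwise. It assumes Windows with the D3D12 headers and linking against d3d12.lib; the SupportsDX12 helper name is made up for illustration.

#include <windows.h>
#include <d3d12.h>
#include <cstdio>

// Passing nullptr as the output device asks the D3D12 runtime whether a
// device *could* be created on the default adapter, without creating one.
static bool SupportsDX12()
{
    HRESULT hr = D3D12CreateDevice(nullptr,                 // default adapter
                                   D3D_FEATURE_LEVEL_11_0,  // minimum level D3D12 accepts
                                   __uuidof(ID3D12Device),
                                   nullptr);                // capability check only
    return SUCCEEDED(hr);
}

int main()
{
    // Pick the render path the thread is arguing about: DX12 where it exists,
    // DX11 as the safe default everywhere else (e.g. Windows 7).
    std::printf("Renderer: %s\n", SupportsDX12() ? "Direct3D 12" : "Direct3D 11");
    return 0;
}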

If we're talking game performance, the 2600K is starting to slow you down a bit.

>Pascal aka shrunk Maxwell does well in DX11 and shit in DX12
Totally surprising for a DX11 ASIC, who would have thought.

>It seems like most games coming out that have DX12 also have a DX11 problem.
A DX11 mode*

So what percentage of games out right now are DX12, and DX12-exclusive with no DX11 mode?

Just trying to make a fair assessment of how big the issue would really be if one of the camps did not fare well in DX12.

>tfw AMD is dying
I want AMD to fail, and Nvidia to become a monopoly. AMD are just holding the GPU industry back with their shitty cards. We will enter a new golden age once they die.

AMD waitfag reporting in

Sorry guys, but Raja is just too cute. This little curry gnome stole my heart.

I'll wait for you Raja.....I'll wait for you 5ever.

Small but important detail: this is not the official DX12 patch they are testing, but a custom-built AMD version of the game.
It hits the 1080 head-on, to the point that it makes it equal to a normal GTX 980... that's how irrelevant this DX12-tailored version is.

Sounds like the sort of thing Nvidia always gets accused of doing; that's pretty disgusting, to be honest.

>can't even reach 60 frames per second
WHAT A SHITTY CARD

>curry gnome

Won't happen overnight. It will be like DX10 and DX11, where games play on both but have an option for DX11 features, and the older DX version slowly gets phased out.

This time it's different though since we also have Vulkan.

You're right, they just give each other the silent treatment.
I dream Sup Forums will do this one day. Imagine it: a board without autistic flame wars.

this

A game is made that is *GASP* completely compliant with a standard, and doesn't incorporate shit that drags performance down... the nerve of them.

Many AAA games coming out by Christmas will have DX12. They may not be DX12-only due to how many people are still on Win 7, but it will still be a large number, and if Doom's Vulkan renderer does well, we may see them slide to Vulkan instead, and only Vulkan. Next year is when you will see some of the bigger AAA titles go DX12/Vulkan exclusive, at least the non-Microsoft-published ones, and that's when it will hurt.

Nvidia will have close to a year at least on an architecture that just can't handle a modern API, possibly more if Volta isn't completely compatible, though I'll be honest, I doubt it won't be; as shit as I think Nvidia is as a company, they aren't brain-dead.

Bait

>480X will be faster than 1080
>$300 card faster than $700 card

NVIDIOTS ON SUICIDE WATCH

>He's getting a single 8-pin card

So how much would these cost?

The future was 2011 to 2015, when AMD cards held the performance lead in almost every benchmark with every release, until Nvidia released whatever x80 Ti, only for AMD to retake it after a couple of weeks with driver tweaks?

Am I the only person around here who actually remembers that the only time Nvidia had a performance lead before the 980 was when the game made heavy use of their proprietary PhysX or a fuckton of AA? Am I the only guy who remembers that the fucking Fury and Fury X both outstripped the 980 with driver tweaks, and it took the release of the 980 Ti to bring Nvidia back into the lead, until driver tweaks and third-party cooling solutions put the Fury/X on par with it in most applications?

Why the fuck are all of you so damned delusional?

That doesn't really matter.

You can't claim to be the Top End™ king of cutting edge technology if you can't actually use cutting edge technology very well.

>Fury and Fury X both outstripped the 980 with driver tweaks and it took the release of the 980 Ti to bring Nvidia back into the lead
Are you sure you're not suffering from a concussion?
The 980 Ti came before the Fury, so how could it have taken its release to claw back first place from cards that weren't even out when it launched?

Also, do you remember the "Overclocker's Dream" hype, only for the card to do maybe +50 MHz on the core if you're really lucky?

>A game is made that is *GASP* completely compliant with a standard, and doesn't incorporate shit that drags performance down... the nerve of them.

How do you know that? Did you look at the code, or are you just believing what AMD told you?

>AMD provided us with a preview build of the DX12 version of the game, which conveniently includes a built-in benchmark—or rather, it's only a benchmark, so you can't actually play the game. Meanwhile, the retail launch doesn't include a built-in benchmark, at least not right now, meaning there's no way to directly compare performance from the two builds.

>We’re pleased to confirm that Total War: WARHAMMER will also be DX12 compatible, and our graphics team has been working in close concert with AMD’s engineers on the implementation. This will be patched in a little after the game launches...
wiki.totalwar.com/w/Optimisation_Blog

Hmm

Always funny to see an AMDrone being called out on his lies

I honestly don't know what it is about AMD specifically that makes all these cultists so attracted to them.
They take whatever piece of PR spin Huddy utters as the word of God and run with it.

>I honestly don't know what it is about AMD specifically that makes all these cultists so attracted to them.

It's cheap

THAT FUCKING R9 NANO!!!!

Not that cheap. A $50 difference on a new GPU is pretty much meaningless.

That fucking CPU bottleneck.

Shit, you're right. I could have sworn it was the other way around.

Well, whatever. I'm wrong about the release times for the R9/900 stuff, but what I said about the 7900 and R9 200 time frame, that 2011 to 2015 block, is still entirely accurate.

Nvidia has the edge in sales because of the 970, which everyone bought up in droves because of Nvidia's lies about the video RAM. Then, because Nvidia fanboys are just as stupid as any other kind of fanboy, tons of them just fucking upgraded to the 980, by which I mean they took the offer of paying the difference to Nvidia.


Regardless, I'm not trying to argue that either company is intrinsically superior, because I'm not a fucking fanboy of either of them. I go with whatever is best at the time. I'm trying to highlight this abso-fucking-lutely irrational idea that Nvidia has always been the superior GPU maker, when that wasn't the case for four years straight in recent history.

Was meant as a reaction to

Still cheaper, and people always want to believe they can get a free lunch.

That's why you'll constantly see cherry-picked benchmarks in which a Fury X performs close to a 980 Ti, while in 99% of games it won't.

>which everyone bought up in droves because of Nvidia's lies about the video RAM

You are honestly convinced it was the 0.5GB of VRAM difference that made so many people purchase the card? And not the stellar performance/power consumption at the price point it was offered?

Sounds pretty delusional

>but what I said about the 7900 and R9 200 time frame, that 2011 to 2015 block, is still entirely accurate.
That's not true either. Or rather, the way you present it is very simplistic.

Maxwell came out in September 2014, and it dominated in sales not because of lies about 3.5GB, but because it initially offered performance on par with a 290X for 40% less, with much better thermals and power draw. It took AMD 9 months to get an answer out.

As for Kepler, it certainly was the more efficient and performant architecture at the time. The 680 was slightly faster than or on par with a 7970 for quite a while, even though it is a significantly smaller chip.
The high point of GCN is that it was a more forward-looking architecture, but its strong suit didn't come out to play until 2 or 3 years after its initial launch.
In older games (2013 or earlier, some 2014) Kepler is still faster.

Considering I remember the absolute holocaust of anger and backlash that caused Nvidia to offer a sweeping solution of letting people effectively buy two cards for the price of one, yes I am honestly convinced the thing that sold it so well was that people thought they were getting more performance than they actually did.

>it dominated because of price/performance
And a large part of that initial hype was caused by lies.

Also, again, the 680 was slightly faster than the 7900 series, until driver tweaks. Cards last longer than their initial release. I spent months doing research and scanning benchmarks before I upgraded from a 480 to a 7950. The 7900 series was consistently the better overall performer compared to the 600 series, especially in third-party applications.

My old 7950 with an overclock was still keeping up with my 970, by which I mean it was close enough, before anyone freaks the fuck out. I'd love to be able to get both running in async with DX12, but the future isn't that bright.


Regardless of any of that, I'm pretty excited for some new architecture. However, I'm reserving judgment until there are both 1) third-party 1000 series cards and 2) an actual AMD GPU out to be benchmarked at all.

Given that the R9 release was kinda underwhelming but the Furys still managed to perform in line with Nvidia's best, I'm not going to just write them off entirely because lol fanboyism.

>yes I am honestly convinced the thing that sold it so well was that people thought they were getting more performance than they actually did.

So the fact that they actually were getting tons of performance at a low price point, and the benchmarks showed it, had nothing to do with it selling well?

I have to, due to my current build.

I think the fact that consumer backlash was so strong that the parent company literally gave away two cards for the price of one says all that needs to be said.

Considering that sales of the card absolutely fucking tanked in the following months after the debacle, I think that also says plenty.

I do think people were running out to buy it for its price/performance point, but I also think they believed they were getting more than they actually got, and when the reality of the situation hit home, it showed in sales.

>but I also think they believed they were getting more than they actually got

That's a really weird assumption though.

People base their performance expectations on benchmarks.

And they got exactly the performance that benchmarks showed with the 970.

It's not like the leak of the 0.5GB changed anything about how the card performed in benchmarks before it leaked.

Also
>considering that sales of the card absolutely fucking tanked in the following months after the debacle,
I think you're just spouting stuff you can't back up; let's see a source for that.

Except, no, not everyone bases their performance expectations on benchmarks. There's a reason that the companies slap simple, big numbers on all the advertising for things. The GTX 970 is still branded as a 4GB card, even though everyone who bothers to do their research knows that isn't entirely accurate.

You're assuming everyone does tons of research before they go buy a GPU just because you or your friends or people on Sup Forums do. Most consumers are still uninformed.

If what you're saying was the case, there wouldn't have been the massive backlash from people who bought the card.

kitguru.net/components/graphic-cards/anton-shilov/sales-of-desktop-graphics-cards-hit-10-year-low-in-q2-2015/

Considering the GTX 970 was their number one seller, it stands to reason that was the cause, especially given that once the 3.5GB fiasco blew over, sales went back up.

DirectX 12 is a future API; that means Nvidia cards will be shit in the future.

>considering that sales of the card absolutely fucking tanked in the following months after the debacle,

Rational people do, and yes, I am assuming people who spend this kind of money on cards also make the effort to at least look up a couple of benchmarks on YouTube; seems like a reasonable assumption.

>kitguru.net/components/graphic-cards/anton-shilov/sales-of-desktop-graphics-cards-hit-10-year-low-in-q2-2015/

That's not proof, and not what I asked for. Show some proof that after the 3.5GB leak, sales of the 970 specifically 'absolutely tanked' like you claimed.

And please filter out the natural sales curve, which will go down after most people have upgraded.

>GTX1080 4 FRAMES AHEAD OF THE 980 AND 2 FRAMES BEHIND THE 980 TI

IT'S OVER, NVIDIA IS FINISHED

When will you manchildren stop playing games?

Epic DX12 support, Nvidia

>Nvidiots still alive and breathing

You would almost have a point if it wasn't $600.

>Here's a crazy concept: why don't you wait until Team Red releases their shit before claiming they're finished?

Kinda funny how they JUST released 3xx.

Now the AMD drones are crying again because their hardware became obsolete in less than a year.

>It's only okay if Nvidia does it with Gayworks.

Strange, did the AMD cards suddenly stop playing video games?

>970 is more powerful than 2 970 in SLI
What?

In what shithole country? I bought a 970 a few days after launch and swapped it out a few times hoping to get one without coil whine (which failed and I never ended up getting one at all in the end).

>1080 slides show 2.1 GHz benchmarks
>actual 1080 barely scratches 1800
>then it throttles back to the 1600s after 4 minutes of gameplay
K E K
E E E
K E K

>admits to buying a shitty graphics card
lol

>just released 3xx
Do you like being retarded?

>AMD is finished
The reign of AMD has not even begun.