Is THIS really gonna be amazing?

I'm a little cautious, guys. I purchased a 980 for $450 and they're telling me the GTX 1070 is only gonna be $379 and better than the Titan X and two 980s in SLI? How can a card cost half of mine and outperform it? Someone please calm my fears, because I'm selling my 980 to an uninformed cuck to buy one of these on June 10th and it better be worth it. Please assure me this isn't a meme.

>How can a card cost half of mine and outperform it?
it's called the advancement of technology.

But won't NVIDIA be killing their sales for like every other card they sell? Why buy anything else?

No one can. There are ZERO real-world benchmarks for the 1080, and there won't be for at least a week. Nvidia's roadmaps have been floating around forever; it's not like you didn't know it was coming.

they don't sell the 900 series anymore, vendors do. they hosed vendors and hoarders, not themselves.

>Better than two 980 in sli
Only in a specific vr scenario using their new technology.

This is a launch conference, so everything is about marketing; don't take everything he says at face value, it's always like this at every launch. If the leaked benchmarks are right, then this is far from impressive for the 1080.

1080 is better than two 980 SLI, 1070 is just faster than Titan X.

That being said, should I sell my 980 to the kid to buy the 1070?

According to a spreadsheet, the only differences between a regular 980 and the GTX 1070 are 8GB of VRAM and 7.2 billion transistors, as opposed to the 5 billion on the GTX 980. The 980 Ti, however, has 8 billion transistors, 2816 CUDA cores (more than both the 1070 and the 980), and 6GB of VRAM. To me this spreadsheet just made the new cards look like a big meme.

For all we know, they will be. Until some solid real world testing shows up, we have no idea how Pascal will perform clock-for-clock against Maxwell. I'm sure that everyone currently in possession of a 1080 card is benchmarking the fuck out of it, so there could be a leak before the embargo lifts. As it is though, we won't have any concrete info until the 17th.

Only 1080 is faster than 980 SLI

1070 is faster than TITAN X supposedly

> GTX 1070 is 8GB of VRAM
it's 7.5GB, not even meming

wrong

> Nvidia said it, must be true!

Nvidia has said 100% of what we know about the cards you fucking strawberry. We should just wait until AMD releases the specs of the 1080/70 right?

i doubt they'd do that again. they already know the backlash it received.

>you fucking strawberry
you're off the hook for now

An 8GB GPU cannot utilize all 8GB of its RAM. Just like your 1TB storage is 930GB, not 1024GB.

Also WTF Google?

>A 8GB GPU cannot utilize all 8GB of its ram

nigga what? this isn't the r9 3xx series we're talking about here. these new cards will be powerful enough to use the full 8gb vram.

You cannot fully use your VRAM.

With my 4GB R9 380, it maxed out at around 3800MB. An 8GB card maxes out around 7600MB.

You're confusing megabytes and mebibytes, or rather the manufacturers are intentionally messing it up to make you think you're buying more.
1 GB = 1000 MB
1 GiB = 1024 MiB
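To see where the "missing" space goes, here's a minimal sketch of the decimal-vs-binary conversion (the function name and values are illustrative, not from any vendor tool):

```python
# Drives are advertised in SI (decimal) units, but operating systems
# typically report capacity in binary (IEC) units, hence the gap.
def bytes_to_gib(size_bytes: int) -> float:
    """Convert a raw byte count to GiB (1 GiB = 1024**3 bytes)."""
    return size_bytes / (1024 ** 3)

one_tb = 10 ** 12            # "1 TB" as advertised: 1,000,000,000,000 bytes
print(round(bytes_to_gib(one_tb), 1))  # → 931.3
```

So a "1TB" drive genuinely holds a trillion bytes; your OS just displays that same amount as ~931GiB. Nothing is unusable, it's only a difference in units.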

>7.5GB
It has 8, it absolutely has 8; just the last .5GB is not as fast. That does not make it have .5GB less of VRAM, quit memeing you sperg

are you trolling? none of what you're saying is correct.

if the gpu is capable enough to use 4 or 8gb vram then it'll use all of it. there's no lower limit. cards like the gtx 960, r9 380, r9 390/x can't use their full 4gb/8gb vram because they don't have enough processing power to crunch that much data. that's not saying that if they did have enough processing power to use 4gb/8gb, they will only use 3.8 or 7.6 gb like you said.

4gb = full 4gb can be used
8gb = full 8gb can be used

i don't know where you're getting your info from.

This is OK.

Nigga you must be trolling too. I mean, WTF. Do you know VRAM is essentially the same as RAM? And do you know that I could as easily fill 16GB of RAM with an i5 Haswell as with an Intel Celeron? Here, try it: open as many tabs as you can in your browser and check your RAM usage. I guarantee you will never max it out, whatever your CPU is.

1TB storage is literally 1000GB, which is 931GiB. It is not advertised as 1024GB (or 1024GiB). None of that is because it can't utilize all of its storage.

go back to Sup Forums

what are you talking about?

>technical discussion about hardware
>Sup Forums

i think you should be the one going.

Pretty sure someone already said the 1070's memory is split into eight 1GB chips, so it's either gonna be a proper 8GB or 7GB with 1GB gimped. Hopefully Nvidia learned from the 970 fiasco, though.

NVIDIA's own graph puts a GTX 1080 at ~20% above a Titan X in pure performance. 180W, 20% above Titan X and $600 MSRP (once custom versions are out) makes for a pretty decent card in today's high-end market. 8GB VRAM and comparable bandwidth to a 980 Ti too, so it should have fairly decent staying power.

GTX 1070 is supposedly faster than Titan X, but since the 1080 is barely 20% faster, GTX 1070 must be faster by a mere insignificant amount, maybe 5% or some shit. I think it's safe to assume it will be essentially equivalent. It's a very tempting offer for $380 if the performance claims end up being true. 8GB VRAM but it's only GDDR5, not GDDR5X like the 1080, so I'd be a bit worried about performance at high resolutions and/or in future games, but we'll have to wait for reviews to get an idea if it's bandwidth starved or not.

In any case you should wait for actual, serious reviews to be out and for AMD to announce their shit too. Based on all the rumors and on AMD's actual claims, it would seem like they will not come out with any cards in the high-end segment, so they probably won't even attempt to compete with the GTX 1080, and they may not even compete with the 1070. Just wait and make an informed decision.

>b-but nvidia said
Trash

>Yes these 4 gig sticks of RAM come with spinning disks
>3.5GB is dedicated volatile memory while the other 512 Meg is stored to the microformfactor™ rotating magnetic disk
>It still counts as 4Gigs of memory.

>this spreadsheet just made the new cards look like a big meme.
Congratulations, you're more intelligent than the average Nvidiot. Now compare AMD's cards with the garbage Nvidia is shilling and join us.