So I'm looking at the 1080 benchmarks

And it seems like it's literally a 980 Ti with 20% fewer CUDA cores/TMUs/ROPs but overclocked 60%

How did they manage to get it to overclock so much?

By letting it burn down your house

Massive process shrink. The architecture is literally the same though.

Is 1070 benchmark out yet?

Work out the percentage differences between the 1080 and 1070 for clock, CUDA cores, bus, ROPs, etc. Take the average = 1070 performance. It will scale down almost perfectly clock for clock, as in the napkin math below.
Since the bus is narrow anyway, GDDR5X vs GDDR5 is not a huge factor.
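
Quick sketch of that estimate in Python. The 1080 numbers are from the announced spec sheet; the 1070 figures are rumored/placeholder values until real benchmarks land, so treat the output as ballpark only:

```python
# Rough estimate of 1070 performance relative to the 1080, per the method
# above. 1080 numbers are from the announced spec sheet; the 1070 figures
# are rumored/placeholder values -- swap in the real ones when they're out.
specs_1080 = {"cuda_cores": 2560, "boost_mhz": 1733, "mem_gbps": 10.0}
specs_1070 = {"cuda_cores": 1920, "boost_mhz": 1683, "mem_gbps": 8.0}  # assumed

# Ratio of 1070 to 1080 for each spec, then the plain average the post suggests.
ratios = {k: specs_1070[k] / specs_1080[k] for k in specs_1080}
estimate = sum(ratios.values()) / len(ratios)

print(ratios)  # cores 0.75, clock ~0.97, memory 0.80
print(f"1070 ~ {estimate:.0%} of a 1080")  # ~84% by this crude average
```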

I have a 980, should I get a good aftermarket Ti when the price drops, or see if AMD has anything?

By removing the extra bulky cuda cores and using a stealthy radar-deflecting body it allows the card to overclock without the 980 ti finding out.

Top kek

No real reason to upgrade unless you wanna do 4K or 1440p @144hz.

See New processing node. Previously they were using 22nm, and now they're using 16nm. The smaller the processing node, the more power efficient silicon gets. I think Intel's down to 10nm these days or something.

>How did they manage to get it to overclock so much?
die shrink

then they'll roll out the big pascal at even higher price with a big jump in performance and stock clocks

By gimping the already gimped 980 ti, Nvidia can deliver groundbreaking improvements over last generation cards, every single generation!

Anyone think the new cooler design looks shit and edgy and prefer the 980 design? Because look how edgy that motherfucker is, I dunno if I should overclock it or give it a box of tissues and tell it to stop thinking no one understands it.

But user, don't you love having more polygons? Don't you like being invisible to radar?

Nvidia keeps fucking shit up recently with design choices and drivers, that's why I stick with AMD now.

>nvidia keeps fucking up with design choices and drivers that's why I stick with AMD
>design choices and drivers
>stick with AMD

>New processing node. Previously they were using 22nm, and now they're using 16nm. The smaller the processing node, the more power efficient silicon gets. I think Intel's down to 10nm these days or something.

22nm -> 28nm
10nm -> 14nm

Premium components and craftsmanship

More, smaller transistors allow a much higher effective memory clock rate and thus much higher memory bandwidth with much better performance per watt. Clock it faster until it overheats.
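
The bandwidth math itself is simple enough to sanity-check against the published spec numbers (worth noting the 1080's faster GDDR5X on a narrower bus actually lands just under the 980 Ti's total):

```python
# Peak memory bandwidth = effective data rate x bus width, using spec-sheet
# numbers. The 1080's faster GDDR5X on a 256-bit bus still lands just under
# the 980 Ti's wider 384-bit GDDR5 setup.
def mem_bandwidth_gbs(effective_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s (bits -> bytes, hence the /8)."""
    return effective_gbps * bus_width_bits / 8

print(mem_bandwidth_gbs(10.0, 256))  # GTX 1080, GDDR5X: 320.0 GB/s
print(mem_bandwidth_gbs(7.0, 384))   # GTX 980 Ti, GDDR5: 336.0 GB/s
```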

bump

They were using 28 nm, not 22 nm.
Only Intel got access to 22 nm; no discrete GPU used it.

Pascal is built on 16 nm, not 14 nm.
Polaris is the one that will be 14 nm.

My mistake, sorry.

>Pascal is built on 16 nm, not 14 nm.
>Polaris is the one that will be 14 nm.
I didn't say anything about 16nm.
I corrected him on 22nm and 10nm.

>nvidia drivers kill GPUs on three occasions since fermi
>nvidia drivers caused crashes on chrome that went unresolved for 8 months
>nvidia drivers failed to support some GPUs for over a month after the release of windows 10... and those were expensive current gen GPUs, not old ones
>nvidia drivers gimp performance of older gen cards, nvidia cards that previously beat the competition now soundly lose to them

>"GUISE, AMD DRIVERS ARE BAD, AMIRITE? XD"

Why are there less cores? What limits the amount of cores a GPU can have?

Maybe the 16 multi-projection viewports (15 more than anything else) took up some extra transistors that were lying around.

>all these made up bullshit memes made by AMD fanboiis
Haha okay kid.

>Why are there less cores?
You mean "fewer".
There are fewer cores because this is GP104, and you're comparing it to GM200, which is larger. You should be comparing it to GM204 (the chip used on the 970 and 980), and in that case it has more cores as expected.
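
The core counts bear that out (quick check using the public figures for each chip):

```python
# Core counts for the chips in question (public spec-sheet figures).
gm204_980 = 2048    # GTX 980 (GM204)
gm200_980ti = 2816  # GTX 980 Ti (GM200)
gp104_1080 = 2560   # GTX 1080 (GP104)

print(f"1080 vs 980:    {gp104_1080 / gm204_980 - 1:+.0%}")    # +25%
print(f"1080 vs 980 Ti: {gp104_1080 / gm200_980ti - 1:+.0%}")  # -9%
```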

The question you should be asking is why did Nvidia decide to cuck their customers and offer the smaller GP104 for the prices they should be offering the larger GP100 for, and still got praise for fucking their customers over.

Just for comparison, the GM204-based 970 launched at $330. Its replacement, the GP104-based 1070, will launch at $450. That's $120 more for the same class of card.

So much denial, it's hilarious.

>killing GPUs
2010: betanews.com/2010/03/18/nvidia-admits-geforce-drivers-responsible-for-fan-problems-issues-updates/
2013: modcrash.com/nvidia-display-driver-damaging-gpus/
2016: wccftech.com/nvidias-latest-game-ready-driver-allegedly-killing-gpus-plagued-issues/
There was even another one that happened to some 800M series GPUs as well, but I can't seem to find mentions anymore; the three listed should suffice.

>chrome crashes
Documented here in March 2014, the first mention I could quickly find: bugs.chromium.org/p/chromium/issues/detail?id=350547
But if you weren't such a newfag you'd have seen that, starting early 2015, every week there would be at least one thread here of people complaining about this.
Fixed only in June 2015, over a year after they first found the issue on Chromium: techreport.com/news/28551/nvidia-353-38-hotfix-driver-fixes-chrome-crashes-g-sync-lag

>windows 10 support issue
techfrag.com/2015/07/25/nvidia-geforce-gtx-980-ti-owners-unable-to-get-windows-10/

>gimping
Conveniently shown on another thread here

They're pretending that there won't be a consumer GP100 card.
Which is bullshit, of course. The 1080 Ti and/or Titan XN or whatever will have it.

Rekt

Which of the 1080 and 1070 will be the better value for money? I want to be neither a cheapskate nor a spendthrift.

You forgot about the recent cases of laptop screens getting rekt by Nvidia drivers wrecking the EDID data in the panels:
forum.notebookreview.com/threads/windows-10-nvidia-whql-drivers-are-killing-alienware-and-clevo-lcd-panels.779449/

Thanks, user.
I forgot a bunch of stuff actually. Do you remember when Nvidia promised people overclockable laptop GPUs, people bought them, and after getting their sweet sweet cash Nvidia just went "lol, JK, this new driver will take the overclocking you paid for away from you!"

Also have this question.

Yes, and the backlash that was involved. Good times.

>Good times.
And of course the 3.5 GB disaster. It was absolutely beautiful to see all the buttblasted 970 owners here bitching endlessly about their shit cards. Having a selection of at least 12 "970 3.5 nvidia fucked me over I'm so mad" threads at any given time to pick from and go laugh at nvidiots. Especially the ones that, after being caught in the worst GPU launch disaster ever, still wanted to go give Nvidia even more money (almost $200 difference) to upgrade to the 980.
I think the only event more polarizing than that in recent history was the 7x1. I sure hope Nvidia fucks up again this time so we can relive that.

>mfw even after all this there were still retards unironically suggesting others buy that piece of shit 970 3.5 GB stutterfest edition

it's a failure

i was going to get one of these, but looking at reviews it seems i might have to wait a little while longer

why?

well, from what i've seen and read, this thing will heat your room up to unbearable levels (it's summer, guys). then there's the heat getting trapped in the backplate,
and one thing someone mentioned, which has put me off buying a Founders Edition, is the VRM; apparently Nvidia skimped out on it

so i'm going to wait a few months to see how many die or start getting artifacts, crashes and other problems

i'm in no hurry desu

vid related

We should really have an Nvidia shitlist with sources. This is a good start.

To add:
- ridiculous tessellation
- woodscrews
- housefires
- 3.5

to be honest, if i were you, i would wait until they bring out the cards with aftermarket coolers (and potentially better VRMs for overclocking)

>4 GPUs get killed by drivers
>literally millions of people have Nvidia cards.
>all the broken down AMD cards are not counted

AMemeDrones at it again

FURY X MASTERRACE

But I kind of need one now. Will the resale value on the 1070 be decent enough that I can roll with it for a while do you think?

Or I could get a second hand card of an older generation for the time being perhaps.

>Previously they were using 22nm, and now they're using 16nm.

16nm FinFET as far as I'm aware, which has a massive advantage for power efficiency above and beyond the process shrink, so they can push the clocks much higher and still keep the same thermal profile.

There are a few architectural design updates, but the new process is the biggest difference.
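
The usual back-of-envelope here is dynamic power scaling as C*V^2*f. A minimal sketch, with purely made-up illustrative inputs (not measured values for either chip), just to show why lower capacitance and voltage buy clock headroom:

```python
# Back-of-envelope CMOS dynamic power: P ~ C * V^2 * f. The inputs below are
# purely illustrative guesses, NOT measured values for either chip -- they
# just show how lower switched capacitance and voltage buy clock headroom.
def dynamic_power(c_rel: float, volts: float, freq_ghz: float) -> float:
    """Relative dynamic power; c_rel is switched capacitance vs the old node."""
    return c_rel * volts**2 * freq_ghz

maxwell = dynamic_power(c_rel=1.0, volts=1.20, freq_ghz=1.2)  # 28nm baseline
pascal = dynamic_power(c_rel=0.6, volts=1.05, freq_ghz=1.7)   # 16nm FF guess

print(f"Pascal vs Maxwell relative power: {pascal / maxwell:.2f}x")  # ~0.65x
```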

The reality is Maxwell was a very efficient architecture for 28nm.
If you look at per watt efficiency compared to other 28nm chips, it was the best design on that process.
After Maxwell released Nvidia was more worried about their own chips competing with their next generation than anything AMD could feasibly offer within the year.
If you paid close attention to Nvidia's statements and behavior you would realize that the leap to 16nm was very rough and did not get launched on time with their expectations.
>The "paper launch" fiasco, and them taping out Pascal months later than they had planned.
Maxwell was released with very low clockspeeds compared to what was possible with the architecture because Nvidia was uncertain about their timeline for 16nm products and AMD had no competitive offerings when it was released.
The bizarre range of factory speeds on Maxwell made it apparent the cards were scaled back.
The cards for the 980ti in particular varied from a reference of 1076MHz boost speed, all the way to more than 1300MHz for partner boards.
The 950 chips showed this, since they were Maxwell chips running at 1350MHz+ from the factory, which was done because AMD had competitive offerings in that price range.
If you look at the competitive overclocking benchmarks for Maxwell it becomes apparent that the chips could have been released at much higher speeds, but weren't.
Competitive benchmarks were commonly stable at 1.6GHz to 1.8GHz, and up to 2.2GHz was possible.
However, with Pascal, Nvidia is actually expecting performance against AMD's offerings to be close and the early attempts at overclocking are leaving the cards at much closer to factory speeds than Maxwell on release.
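
To put rough numbers on that spread (clocks as cited above; the bench figure takes the low end of the quoted range):

```python
# Putting numbers on the Maxwell clock spread described above. Reference and
# partner clocks are the ones cited in the post; the bench clock is the low
# end of the quoted 1.6-1.8GHz competitive-benchmark range.
ref_boost = 1076   # 980 Ti reference boost, MHz
partner = 1300     # high-end partner boards, MHz
comp_bench = 1600  # competitive-benchmark stability, MHz

print(f"partner vs reference:    +{partner / ref_boost - 1:.0%}")     # +21%
print(f"comp bench vs reference: +{comp_bench / ref_boost - 1:.0%}")  # +49%
```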

>TLDR
Maxwell was the best 28nm design.
Nvidia actually released Maxwell gimped because they didn't have competition.
A new release has to beat their own generation in performance (980ti).
Pascal is already running close to its real limits (before big Pascal) to achieve this.

Fucking rekt famalam

>4 fucking fps more
>$400 fucking dollarydoos

kek

Even better, it's only against a moderately clocked 980 Ti - most 980 Tis go to around 1450MHz, which would close the gap even further.

>amd not competitive with the 950
R7 370, and you are retarded if you think AMD isn't competitive at the midrange and low end

>which was done because AMD had competitive offerings in that price range.
Reading comprehension.

Ironically that's only considered bad when you don't compare it to AMD.

I'm hijacking this thread.

I just read that Total War Warhammer is going to have DirectX 12 support.

I'm currently using a GTX 780 and according to what I've read it's supposed to support DirectX 12, at least partially.

I'm wondering if I'm going to be able to see any sort of performance boost from using DirectX 12 with my GTX 780, or if I should just stay with Windows 8. I only use Windows for games so I don't care about botnet etc.

no, the 780 is gimped, just get Polaris or wait for Vega

>Or I could get a second hand card of an older generation for the time being perhaps.
considering the 1080 and 1070 memecards will be sold out for months, go get a used 980 Ti

>GTX 780

getting an old 290X would be a substantial upgrade, even more so in DirectX 12

see >Implying that the 980ti won't get gimped too

there is literally no need to go to newer drivers

Is NVIDIA getting worse or AMD getting better? Or both?

>get a new card

I will in time, but i own a 780 and i'm asking about the 780.

you forgot it's 3x hotter

nvidia will still make billions off brainless nvidicucks despite every benchmark showing NO change in FPS

That is impressive, only Nvidia can put gamers first!

Both

The only benefit a 780 is likely to see from DX12 is the reduced CPU overhead - something that is particularly crucial for a game like Total War.

both

The thing has almost no DX12 support, no async, etc., so you will not get shit out of DX12

Would a jump to Pascal from a GTX 980 Ti be good if the only thing I use it for is Adobe Premiere Pro? Currently I'm having trouble online editing more than 2 layers of 4K 24fps at a time.

Might as well, because Nvidia is just going to gimp the 980 Ti like they did the 780 Ti. Or just get Polaris.

OpenCL for Adobe is super buggy so far, I actually switched from AMD to Nvidia for that reason. Fucking 4K is a meme right now tbqh, but that's what everyone wants. So that's what I have to use.

yeah because nobody wants to game at higher resolutions with high framerates

If amd drops the ball with Vega we are going to be fucked
Nvidia will be like Intel, giving us 10% upgrades for the next 5 years

>290X
>upgrade over a 780
Why would someone do that, just go on the internet and tell lies?

Because they're not. The 290x is a better card in more games than the 780. There is no way to dispute this. The 290x also scales better at 1440p.

780ti you MIGHT have a point, but the 780 is completely irrelevant

GDDR5X in combination with the process shrink.

even the 780ti has fallen 10-15% behind the r9 290x

techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/26.html

Make your choices Sup Forumsents.

The shocking part of that is the 780's closest competitor from AMD is the 280X (aka 7970 GHz) - a card meant to compete against the 680/770.

Damn, even the Nvidia shill sites don't hide how far behind the 780 and 780 Ti have fallen

Or the fact that the 970 is basically slower than hawaii across the board.

>a collage of nvidia shilling sites

is amd ever going to overcome the popular perception of them as manufacturing inferior, unreliable products? :(

The board partners that make them just don't put as much effort and quality parts into them as they do Nvidia stuff because the market for them is much smaller.

Not only that, it throttles 4-10 minutes into gaming. Those OC clocks in the 1800s go down to the 1500-1600s.

It's pretty hilarious.

How long will it be before the non founders edition hits the market?

"""""June""""""

Ya, I am one of those people that got lied to about the 970. I've been waiting to see if AMD has something comparable.

If you mean performance, then the 390 is better than the 970, and Polaris will be as well

The 1080 doesn't do that though. I'd say wait for 1080Ti or Greenland.

970 is THE card for 1080p

Yeah, which is unfortunate when you consider the 390 is a really good 1440p card.

1070 is THE card for 1440p

they both belong in THE trash

I see where this is going. For 4K it's best to wait for 1170 instead of paying the early adopter tax for the 1080 Ti.

So, 1080ti when?

Sticking with the x70 series for whatever applicable monitor resolution you are using is the most logical way to upgrade.

I just want a 700 series card that won't be too expensive because it's not as recent. Should I just go 780?

No.

Have a 780, take my advice and stay FAR away from it.

>stay FAR away from it.

why?

I need context for this webm. What would give someone the right to brandish a weapon at someone that is debilitated by a door?

Did you read the above posts? x70 is for you if you want the most bang for buck.