Is it worth waiting for the GTX 1080 TI?

Is it overkill? Will it break down before it starts going outdated?


NVIDIA is a meme company. Its whole existence is based on overpriced dies. AMD has exactly the same access to Foundries but it makes smaller dies because it wants to keep costs low.

It's easy to meme people with GPUs because they work in parallel by definition.

When you try to do the same in CPUs, it doesn't work, and that's why AMD looks bad.

If you're worried about it breaking down, don't be. Nvidia will gimp the drivers long before wear and tear becomes an issue

So there's no really good solution?

Unless you've got some specific niche that only a 1080/Ti can fill, there isn't much point in spending an absurd amount of money on one to begin with

I'm not following the very latest but usually the golden solution is somewhere in the middle. Last time I looked at it I figured that for my 1080p needs an excellent/the best price/performance ratio was an AMD R9 290 that was just being released at the time.

buy whats good now
don't bother waiting for something without a release date, you will only end up disappointed

Nope. Just want to play recent games at 1440p. Also thinking of buying a 6700k, 32gb ddr4 2133mhz, z170-pro gaming. Again overkill, maybe?

32gb is total overkill. 8gb is enough 99% of the time, so you could go 16gb at a stretch and almost never expect to need more (for gaming).

CPUs play a role but all Intel i7s for the last 7 years are close to each other. Little difference between a 6700k and a 4770k for example.

For 1440p I'd say you'd need something strong but not THE strongest. It depends on the budget of course. But don't go lower than midrange.

Rumor has it that it's been delayed 4 months. During testing it burned down the whole factory and only 1.7% survived, but they made some killer pork shoulders with wood screws.

Better to get less RAM that's faster than more RAM that's slower. e.g. most games will never need more than 8GB, or 16GB at a stretch (if ever). So put the saved money toward faster RAM, not more RAM.
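To put rough numbers on the speed-over-size argument (a back-of-the-envelope sketch; these are theoretical peak figures assuming dual-channel DDR4 with a 64-bit bus per channel, not benchmarks):

```python
# Rough illustration: theoretical peak bandwidth of dual-channel DDR4,
# assuming a 64-bit (8-byte) bus per channel. Real-world throughput is
# lower, but the ratio between the two speeds holds.
def peak_bandwidth_gbs(mt_per_s: int, channels: int = 2, bus_bytes: int = 8) -> float:
    """Megatransfers/s * bytes per transfer * channels, in GB/s."""
    return mt_per_s * bus_bytes * channels / 1000

print(peak_bandwidth_gbs(2133))  # DDR4-2133
print(peak_bandwidth_gbs(3000))  # DDR4-3000
```

DDR4-3000 buys roughly 40% more theoretical bandwidth than DDR4-2133 for the same capacity.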

Thank you for the tips.

By the way, is it me or do Intel CPUs only support 2133mhz at maximum?

If I recall correctly only the overclocking-supported motherboards go 2400 or more but I can't be sure of exceptions.

But it's not a big deal, some overclocking motherboards are cheap enough.

they support 2400 but there are motherboards that can overclock past 3000

It checks out. Thank you

>amd is shit
>blame nvidia

I'll buy whatever second-best AYYYYYmd card comes out in May because I just want freesync and not to be jewed out of my money.

Volta is supposed to be a dramatic architectural bump, while Pascal (excluding GP100) is basically a Maxwell retread.

If you have $1k+ or whatever to blow on a 1080 Ti, go for it, but Pascal will more than likely get the Kepler treatment in late 2017 or early 2018.

>most games will never need more than 8GB or at a stretch 16GB

When I built my pc recently everyone said 8gb would be enough, but it really wasn't. I experienced a lot of stuttering in games like Watch Dogs 2 and Just Cause 3 even when the framerate was high. I recently got 2 more sticks to get it up to 16gb and that eliminated all of the problems immediately. I'd say 16gb is the new minimum nowadays, unless it's a sub $500 build

I just decided to crack open the old-ass machine I have in the corner to see what new parts it would need to get up to par for gaming. The motherboard is currently a MCP61PM-GM (pc-specs.com/mobo/ECS/ECS_MCP61PM-GM/1456) that supports a max of 4GB DDR2. According to your comment, that's basically gg, throw the board out and start over, correct?

A lot of games run fine on 4GB.

I have yet to see anyone provide actual evidence that nvidia gimps old cards with driver updates.

I am waiting for it, but mostly for it to drive down prices of the 1070

Got better ways to spend money, like on stocks.

It's less gimps and more Nvidia cutting off support right as they launch new cards. Meanwhile AMD provides support much longer which allows older cards to match and sometimes overpower the nvidia card rival of that gen.

My 9800GT still receives the latest drivers.

Try using an AMD HD5000 on Windows 10.

if by "support" you mean improvements, sure. But your assertions about AMD are complete shit.

if the RX 480x or whatever its called is shit I'll probably go with the 1070.

I have a 5770 running W10 just fine user.

My mate's card refuses to work under 10, he had to go buy a new one.

Again, my 9800GT hasn't skipped a beat.

Regardless, I'm still waiting for proof of gimping.

No but wait for Vega

I believe I'm using 8.1 drivers.

>But your assertions about AMD are complete shit.

I'm not saying Nvidia stops doing drivers, but really? It's been shown a few times at the very least that AMD cards close the gap on Nvidia cards. The 480 and 1060 are within like 2% of each other now. What the fuck happened to that 10% lead Nvidia bragged about at launch?

>tfw my 980 ti hasn't been memed into irrelevancy as fast as i feared

I'm a VRfag so if the new Ti is a proper screamer card, I'll strongly consider buying it.

>sometimes overpower the nvidia card rival of that gen.

Name a single case of that happening.

>What the fuck happened to that 10% lead Nvidia bragged about at launch?

AMD got their shit together over time? I'd prefer the card that works optimally the day I buy it.

>Name a single case of that happening.

I'll admit to being a fuck up and exaggerating; the gaps on cards do close, the 1060/480 one being a recent example. Wouldn't even be bringing this up as much if the card wasn't still $50 more expensive in the states. Even if it kept the lead, the premium was sketchy.

Memes aside, you can't deny that Nvidia destroys AMD with how they do driver updates. Nvidia cards will get them practically once a month.

And they don't *usually* break them. When the thunderbolt compatibility driver update came out I could eject my 980ti like a usb drive from my system tray.

How good will it be for deep learning? Between a 1080 and a Titan, sure, but what about cost/perf?

nvidia doesn't cut off driver support period, on top of that you can still download releases without their stupid experience app too

Just buy a Titan X you stupid fuck

AMD does not have the same access to fabs. They're bound by the wafer supply agreement they have with GloFo, which forces them to pay extra for wafers not fabbed on GloFo's (currently inferior) process.

That's for rendering and autocad projects; the 1080 is better for playing.

The irony, call someone a stupid fuck and then tell them to buy a titan x to play games on.

List of old nvidia GPUs I've used which never showed signs of driver gimping:
* Diamond edge 3D
* Riva TNT2
* Geforce 3 Ti 200
* Geforce FX 5500
* Geforce 6600 GT
* Geforce 8800 GTS
* GTX 470
* GTX 760 (current gpu)

List of old nvidia GPUs I used which were gimped after a driver update:
* None

newsflash: you can play games on it

>Diamond edge 3D
surprised you gave them a second chance

>* Geforce 3 Ti 200
>* Geforce FX 5500
what a sidegrade

Different PCs user

There are definitely differences in their deals with the Foundries, of course, but the main reality remains that neither NVIDIA or AMD have their own Foundries and that drives their prices up.

It's not a coincidence that every single piece of technology NVIDIA releases that's faster than AMD's is on a bigger die, at an aggressively higher cost to the consumer.

That reality becomes obvious when you compare with a CPU made by Intel, a company with their own Foundries: they can destroy both AMD and NVIDIA on prices.

Intel literally broke the law repeatedly, giving kickbacks to companies to buy Intel and bribing companies to release compilers that don't support AMD. They did it for like ten years.

Wasn't it literally 470s that burnt out from a driver update? The 970 3.5GB is also gimped without any drivers needed. They also fuck laptops out the asshole, I can't use HDMI audio because their DRM doesn't recognize it.

if you like the name, it's your wish whether you should buy it or not.

>Wasn't it literally 470s that burnt out from a driver update?

Not mine at least.

>The 970 3.5GB is also gimped without any drivers needed
Yes that was a dick move to release it like that. But it was like that from day one and not only after a driver update.

> compilers that don't support AMD

That's utter bullshit. We have benchmarks compiled with GCC, not even running on Windows, that show Intel being extremely faster than AMD per thread for the last ~10 years.

Only a delusional AMD user with regrets of their purchase and cognitive dissonance reaching max would even dare to claim AMD CPUs were amazing lately, per thread especially.

Sorry, but you were a moron; you fell for the AMD CPU memes. GPUs are still best from AMD but that's mainly because NVIDIA is also limited: both have no Foundries.

use a gtx today with the latest legacy driver and then try drivers that are 2-3 years old, you get nearly double the fps in most games

the same was true for my gtx580: using the 358.00 drivers now instead of the new ones got at least 30% more performance

ah shit, typo, meant "use a gtx 460 today with the latest legacy driver and then try drivers that are 2-3 years old, you get nearly double the fps in most games"

The world is older than 10 years user

techreport.com/news/8547/does-intel-compiler-cripple-amd-performance

and it's still happening:
techpowerup.com/forums/threads/intel-compiler-patcher-boosts-amd-processors-performance.217814/

Companies don't use GCC compilers for their software, they use Intel's, because they were paid to in the past and it has since become part of their toolchain.
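What those links describe can be sketched in a few lines. This is an illustrative mock-up, not Intel's actual dispatcher code: a vendor-gated dispatcher sends anything that doesn't report "GenuineIntel" down the slow generic path, while a feature-based one only checks whether the fast instructions are actually supported.

```python
# Illustrative mock-up of the two dispatch strategies (not real compiler code).
def dispatch_vendor_gated(vendor: str, has_sse2: bool) -> str:
    # Checks the CPUID vendor string first: non-Intel CPUs get the
    # slow generic path even when they support the fast instructions.
    if vendor != "GenuineIntel":
        return "generic"
    return "sse2" if has_sse2 else "generic"

def dispatch_feature_based(vendor: str, has_sse2: bool) -> str:
    # Checks only the feature flag; the vendor string is irrelevant.
    return "sse2" if has_sse2 else "generic"

# An AMD CPU with SSE2 support:
print(dispatch_vendor_gated("AuthenticAMD", True))   # generic
print(dispatch_feature_based("AuthenticAMD", True))  # sse2
```

This is also why masking the vendor string (as mentioned later in the thread) could change performance: the code paths were gated on identity, not capability.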

I tried that GTX 760 with both 340.x and 375.26.x (now) drivers and I see no perceived difference in performance. If their goal was to push me to a newer card they failed miserably.

But it's still a moronic statement, only coming from spergs who have regrets about their AMD purchase, because we know for a certain fact that several benchmarks compiled with GCC on UNIX show similar results.

The short story is that even if Intel favored Microsoft+Intel in some cases, a) it was a minuscule advantage when it happened and b) it didn't happen in the overwhelming majority of cases because of gcc/other compilers.

As said, sorry, but you were a moron; you fell for a meme. Excuses won't save you. Intel is not "evil", they just have their own Foundries, which cost a gargantuan amount of money to build.

no, it's gimping alright, check Fallout 4 benchmarks. The 780ti went from good to piss poor after one nvidia update that introduced useless shitworks functions

I didn't think you would get the point.
Go back to Sup Forums

Barely. Get two in SLI, you pleb.

Waste of money to get 10% more fps.

Sorry, I omitted the sarcasm flag

Joke's on you. I doubled your sarcasm.

>Better get faster RAM
you're fucking retarded

>Nvidia uses a bigger die

This is wrong though. This hasn't been true since the HD 7000 series. The 7950/70, 290/x, and fury chips were all much larger than their competitors. You could argue the fiji die itself was around the same effective size as the one used in the 980ti/titan xp, but the interposer and HBM swell the size of the fury's chip in a way, as the interposer itself is silicon and can lead to a trashed part if the mounting process doesn't go perfectly.

It's only with polaris that AMD has shrunk their die area back down, but vega is going to be a massive fucking die with all the same trouble regarding interposers and HBM the fury had.

I'm hoping Vega is as big a hit in the GPGPU market AMD is building it to be so we can get a more traditional gaming-centric arch in the near future. They just need that little bit of extra capital and manpower to reasonably support a 3rd arch.

Bullshit, my brother has a 760 and my 750Ti (Maxwell) outperforms his in every game now. You are delusional.

(X) Doubt

Hawaii and Tahiti were both around the size of its competition, which were GP104 and GK110.

GK110 was fucking enormous, there's no way Tahiti was close to its size.

...

If you cover your ears and pray to God the problem will go away.

>bribing companies to release compilers that don't support AMD. They did it for like ten years.

Wasn't it on a Cyrix cpu that people found out it would cut processing time almost in half with certain software just from faking a Genuine Intel identifier?

That never happened.

It did, I just don't remember what cpu it was exactly.
IIRC it was some random dude that went and masked the cpu identifier in windows for shits and giggles, and found the performance difference by chance.

Nobody does that 'by chance', it's obviously fabricated

>Practically once a month

AMD does too, user, sometimes more than once.

...

>amd drivers are shit at launch and need 2 years to mature
>nvidia drivers are good at launch and can't get much better over time

still not gimping

It's also that they don't bother to optimize new games on old hardware, whereas AMD has stuck with an architecture so long that 4+ year old cards are largely benefiting from optimizations meant mostly for newer cards.

TeraScale optimization efforts ended a while ago, but GCN support will have stuck around much longer than Fermi/Kepler thanks to Polaris probably being sold into 2018.

all true
still not nvidia gimping their older cards

FUCKING POOR FAGS

titan xp or bust

Volta is coming at the end of 2017, with 12nm-based chips

Is it niche to want to run newer games at over ~100Hz across three 1440p monitors?

I see this as the only leap to a better experience from my current setup.

Agreed, Nvidia doesn't gimp old cards, they just misrepresent what their products are to the point of borderline fraud.

They portray their products as being top-of-the-line hardware, but in reality you're primarily leasing the attention of their enormous driver optimization and dev consultation team for a year or two, and the raw hardware is arguably underpowered in significant measures compared to AMD.

Nvidia is great if you plan to upgrade every 1-2 years (and can successfully pawn off old cards on eBay before people realize they're now lemons), but you'd be an idiot to believe they're good long-term purchases.

UHD@120/144Hz is coming this year, user.

Nvidia cards, when released, already present close to 100% of their performance capabilities.

AMD needs time.

That's where the meme comes from about AMD cards aging like fine wine.
I would be pissed to have a potent card but drivers so shit that it takes a year or so until it translates into more performance in games.

DX12/Vulkan will shake up the old regime, since it forces game engine devs to put all optimizations in the game (and hence available to all users) rather than having Nvidia/AMD bundle rewritten shaders for every AAA game ever made inside the drivers.

It speaks well for Nvidia's driver team that their DX11 speeds usually match DX12 performance, but it's also a sign that they're going to lose their edge of premier DX11 performance in the long run.

>vulkan
DOA
>dx12
it's going the way of dx10

> all this damage control

both new API models are really for the main engine vendors, of which there are relatively few, all of which are already doing vulkan and/or dx12 versions.

Nvidia will give a lot more support in the future, assuming they finally have a non-garbage implementation of async compute in Volta.

All the different approaches basically lend to the same thing: overlapping post-processing on one frame with the geometry/shadow map rendering for the next, which you can do pretty easily in vulkan/dx12 or more painfully (but still possible) in dx11 shader rewrites.
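That overlap is easy to see with toy numbers (hypothetical timings, purely to illustrate the scheduling idea): once post-processing of frame N runs alongside geometry work for frame N+1, steady-state frame time is bounded by the longer phase instead of the sum.

```python
# Toy model of async-compute frame pipelining (hypothetical timings in ms).
GEOMETRY_MS = 10.0   # geometry + shadow map rendering for a frame
POSTFX_MS = 6.0      # post-processing for a frame

def serial_frame_time() -> float:
    # Without overlap, each frame pays both phases back to back.
    return GEOMETRY_MS + POSTFX_MS

def pipelined_frame_time() -> float:
    # With overlap, steady-state throughput is limited by the longer phase:
    # post-processing of frame N runs alongside geometry of frame N+1.
    return max(GEOMETRY_MS, POSTFX_MS)

print(serial_frame_time())     # total per frame, serial
print(pipelined_frame_time())  # total per frame, pipelined
```

The win depends entirely on how balanced the two phases are, which is why async compute gains vary so much between engines and architectures.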

>this API exposes how shit my company's products are, better call it DOA since i don't know anything about it.

>3 games
sorry my man, it is DOA
I know vulkan is great (for both nvidia and amd cards btw), but it's the truth

Nvidia won't drop the 1080Ti until Vega is out.

>muh vulkan is only for gaymes

that's the professional card

your gaymer cards won't be out till 2018 and will be followed by a refreshed Vega.

>Nvidia is great if you plan to upgrade every 1-2 years (and can successfully pawn off old cards on eBay before people realize they're now lemons)

so much this. I've bought several AMD/ATI cards that are even better than when I bought them, and all of them still work.

I have had two Kepler cards and a Maxwell card fry in my system. Also the Kepler cards went completely to shit after the Maxwell 900 series released. Literally all the games released post-Maxwell had straight up criminal levels of non-optimized drivers.

Hell, even a $700 gtx-780ti is getting fucking stomped by the R9-290X in modern titles. That alone should tell you all you need to know about the longevity of nvidia.

Already the performance of Polaris is above what I expected. It kicks the shit out of the 1060 at 1440p and even seems to run some nvidia-optimized games better. With an 8gb 480 that has a strong overclock you can actually run The Witcher 3 with Hairworks on (high settings 4xAA or Ultra settings 2xAA) and a tessellation limiter set to x16, and keep high 50s up to a 60fps lock. The 1060 only gets a 60fps lock when Hairworks is completely off. Even a damn 1070 cannot maintain a perfect 60fps lock with Hairworks on max settings.

It's ironic and enraging that all the "reviewers" all of a sudden run benchmarks with Hairworks disabled by default. These same shady fucks already used Hairworks to gimp AMD in the past, and all benchmarks had HW on, but now that Polaris actually handles Hairworks better than its equivalent Nvidia card they keep that setting turned off for benchmarking because it makes nvidia look extremely bad.

it literally is

>Volta is coming at the end of 2017, 12nm based chips

protip: TSMC 12nm = 16nm v2 + marketing

It also supports Android on phones. It's completely open source. If you have the time and skill you can port it to any platform.

again, I know vulkan is great
that makes no difference though

should i grab a 1080 now or wait for this shit?

>want to play games on max settings
>decide to get 1080
>this is coming out soon
>dunno if i should wait

My 1070 is doing 66-70fps in the White Orchard exterior, ultra with 8x Hairworks AA, bro. Witcher seems poorly optimized to me; I don't see why people even use it as a standard bench.