Let's be honest now, will Vega be able to actually compete against the 1080ti?

Didn't the benchmarks show that Vega basically beat 1080 by a low margin?

I want a new GPU since mine is getting old (6950), and I planned on waiting for Vega to upgrade. But the way it seems right now, AMD won't be able to compete, and you might as well just buy a 1080ti now and not have to wait.

Can we have a proper discussion please?

When is the last time AMD competed when it came to the highest speeds? Their goal has always been price/performance.

The last benchmark they ran for VEGA, it wasn't even running its own drivers. It was running hacked up Fury drivers.
Wait for Vega to at least get some proper benchmarks out first, because if it even comes close to the same performance of the 1080Ti, it might turn out to be a much better buy in terms of the price.
If it's within 5% of the performance but costs £/$100 less, I'd much rather get Vega over Pascal myself.

2012 and arguably 2013

Vega 10 is a 4096-shader design running above 1500MHz, with a brand new shader architecture and a completely redesigned scheduling system that relies on finely distributed control units to avoid the scaling problems GCN had beyond ~3000 shaders.

amd is fucking finished

>Implying they didn't re-release the Titan X(p) because they were scared of Vega

Please go away. Let's keep this civil.

Titan X P was just to get retarded children to buy defective Quadro P6000s

1080 Ti is the pre-response

They're not going to sell it for $400; they called it Vega, so expect $500-600.

so they either have to give giant performance or hang themselves

Honestly worried. You either go 1080 Ti now, which becomes outdated in 1.5 years, or overpay for consistent but not highest-of-high-end performance.

Can anyone compare the TFLOPS numbers between the Titan and Instinct slides?

So it's gonna take devs years to optimize for it

I am keeping it civil, its a perfectly rational response. AMD are set to release something that, with hacked drivers, beats out the 1080. Nvidia wants to be performance king, so what are they to do? Re-release their strongest consumer card and drop the price.

Exactly what I thought. This also seems to be the case for Ryzen in a lot of games.

I'm a big fan of AMD products (I have an entirely AMD PC with two RX 480s), but I just don't see it. AMD have a history of providing honest but cherrypicked benchmarks that display their products in the best light they can, and if that's true here... that Doom benchmark just isn't enough: a minor improvement over the GTX 1080 a full year later, with a giant GPU die packed full of presumably expensive HBM2 memory? I don't see how it can compete on price, and I don't see how it can compete on performance.

I hope they provide, I don't know, super charged Polaris or Vega with the stupid memory chopped off or something that will compete in the gaming high end and not just go for the datacenter, but looking at what they've shown us I'm not hopeful.

Nvidia always release titan and ti cards. AMD is a fucking joke at this point. Nobody will bother with expensive vega because of HBM2 when you can get 1080 for cheap or wait for volta.

It's AMD. Support will always be spotty.

Look what happened to vulkan and mantle

>Nobody will bother
If it's a worthwhile upgrade from my Fury X, I might, though it has to be a very worthy upgrade because I'd also have to get a custom waterblock for it too.

AMD fag here. I have a 390X and I love it to death, but if they can't match or exceed the 1080 or 1080 Ti for a cheaper price, I'm making the switch.

We don't know how Vega will compete because it has nothing to do with the previous GCN uarchs.

Before, we could estimate performance based on past experience.

But now we simply can't... and the fact that Raja told us the base clock will be 1500MHz says a lot.

Instinct had 25 TFLOPS FP16; FP32, which is what games actually use, would be 12.5 TFLOPS. This works out to ~1536MHz at 4096 shaders. No idea if a consumer card would clock higher, but AMD workstation cards typically clock about 10% lower than their consumer counterparts.

The 1080 Ti has 10.6 TFLOPS at base clock and roughly 11.3 at average boost clock. Typically Nvidia is more efficient per teraflop, but the variable-width SIMD units on NCU might make a difference.
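
The TFLOPS figures quoted in this thread can be sanity-checked with back-of-envelope math. A quick sketch, assuming the usual 2 FLOPs (one fused multiply-add) per shader per clock and the rumoured 4096 shaders at ~1536MHz:

```python
def tflops(shaders, clock_mhz, flops_per_clock=2):
    """Peak throughput: shaders x clock x FLOPs-per-clock."""
    return shaders * clock_mhz * 1e6 * flops_per_clock / 1e12

# Vega 10 / Instinct slide figures (rumoured, not confirmed specs):
vega_fp32 = tflops(4096, 1536)   # ~12.6 TFLOPS FP32
vega_fp16 = 2 * vega_fp32        # ~25.2 TFLOPS with double-rate FP16

print(round(vega_fp32, 1), round(vega_fp16, 1))
```

This is peak throughput only; as the post notes, real game performance per teraflop differs between architectures.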

> Didn't the benchmarks show that Vega basically beat 1080 by a low margin?
Vega isn't out yet, it's pointless to compare and hype it. When it's out, it's out.

I have a 390 as well. I'm contemplating getting a 1080 now or waiting until April for the aftermarket 1080 Tis. Also, whether I should wait for a new 1440p 144Hz IPS monitor or get the Acer XB1. Thoughts?

Might get a 2nd-gen Ryzen CPU and an 1180 next year. I'm sick of waiting for GPUs from AMD that don't deliver; it's been going on 3 years now.

They got really good with GCN drivers over the last couple of years; still, they'll probably fuck something up yet again
i.e. 480: solid hardware, rushed release
Ryzen: very solid hardware, fucked up marketing

It's unlikely that anything Vega will beat the 1080Ti, that card is a fucking monster.
However, there may be some kind of 1080 equivalent card at a hundred dollars or so cheaper than the 1080. That's what I'm hoping for.

But knowing AMD, we'll be seeing a 1070 equivalent at 1070 prices.

Nah all that shit I said is driver and microcode side. The memes that devs will have to optimize for are primitive shaders (can be done driver side as well) and the HBM cache, which basically treats the HBM as a giant L4 cache, which works because it's low latency.

Ryzen's primary issues aren't exactly with games themselves, but with bad launch BIOS and Windows 10 being an eldritch horror of barely functioning code. Windows actually thinks the god damn thing has something like 128mb of L3 and keeps allocating across CCX's which increases cache latency by a third or so. Games are very cache sensitive. The core parking and SMT scheduling issues are secondary to that.

The first Titan is only 4 years old.

What's worse is a 3-year-old Titan getting beaten by modern $200 cards.

...

>fucked up marketing
This is bullshit. They consistently benchmarked it against an eight core workstation chip in professional workloads, then briefly mentioned "It's also good at games" and everyone went "YOU TOLD ME THIS WAS A GAMING CHIP!".

GP102 is actually not that large, 471mm2. Now GM200 on the other hand was 601mm2.

In a year or two, probably Volta, there'll be another 600mm2 range monster.

Well, considering the 390 and 390X lick the 1060 6GB OC and stock 1070 respectively, it's much of a muchness, and you need a very fast 8-core or better i7, or Zen, to get the most out of a 1080 Ti at 1440p or below.

Get a 1080 oc
Some games will never run right on AMD hardware, though. Drivers aside, I'm just sick of fucking waiting for AMD GPUs that never come.

If they fuck up the pricing and marketing with Vega like they did with Zen, I'll just get a Ryzen 1800 and a 2nd-gen AM4 mobo and shove an 1180 in there next year.

Original Titan would actually be around the 280X, which is a rebranded 7970 that was originally weaker than the 680.

Kepler architecture aged like soggy milk.

It doesn't really matter because AMD can't even compete with their low priced cards.

It could've been handled better. YouTube is plainly evil; they shouldn't have sent them samples at all. If they want views, they can buy the cards.

600mm2 at 10nm

don't know if that would work

Lol ironically the 970 is one of the worst cards Nvidia made yet it's the most popular.

Did the same thing with the ti with shit memory bandwidth and 11gb wew

Isn't there 7nm?

It depends how much AMD have improved GCN's performance per clock in Vega compared to Polaris. What we think we know about Vega so far is:
>It will run at 1500Mhz~ with 4096 shaders
Assuming that:
>Performance increases from clock speed scales linearly
>It has the same number of ROPs or the same ROP throughput as Fiji (Fury X)
If these things are true, and performance per clock is exactly the same, Vega will be 50% faster than Fiji, which puts its performance a bit higher than the 1080, but slower than the Titan X Pascal. This can be considered the worst-case scenario for performance. (It also lines up with the performance in nu-Doom AMD showed off, but we don't know if that engineering sample was running at full clock speeds.)

However, there are a few things to consider:
>Fiji had a 1:1 CU to ROP ratio, (64 CU, 64 ROP). This isn't optimal, and caused the Fury X to be ROP bottlenecked. This is demonstrated quite well by the image. If Vega has more ROPs or a higher ROP throughput, then that could also increase performance.
>Polaris had some minor performance per clock gains vs older GPUs, these need to be taken into account
>Fiji's memory controller couldn't effectively use all 512GB/s of memory bandwidth HBM provided, Fiji was actually memory bottlenecked in a lot of situations. Vega should resolve this
>Vega has a lot of minor architectural changes whose full performance impact we don't know. One of the biggest is implementing hybrid tile-based rasterization, similar to what nVidia uses on Maxwell and Pascal. Tile-based rendering is a lot more efficient
All of these improvements could potentially add up and allow Vega to be as fast or even faster than the 1080ti... Or they might not

>TL:DR: We haven't got a fucking clue
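
For what it's worth, the worst-case arithmetic from the post above boils down to one division. Both inputs are the thread's assumptions (a round 1GHz figure for Fiji and the rumoured 1500MHz Vega clock), not confirmed specs:

```python
# Worst-case estimate: identical IPC to Fiji (same 4096 shaders),
# performance scaling linearly with clock speed.
fiji_clock_mhz = 1000   # round Fury X figure used in the thread
vega_clock_mhz = 1500   # rumoured Vega clock (assumption)

speedup = vega_clock_mhz / fiji_clock_mhz
print(f"Vega >= {speedup:.1f}x Fury X")   # floor before any ROP/IPC gains
```

Any of the ROP, memory controller, or rasterizer improvements listed above would only push the real figure higher than this floor.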

TSMC 12nm, like some faggots meme that Volta will be on, is just an optimized TSMC 16nm process that they're renaming because Samsung/GloFlo named theirs 14nm which is smaller than 16nm.

In reality both have a 20nm backend so it doesn't fucking matter.

Samsung 10nm has a 14nm backend btw.

pricing is fantastic for zen, what are you on about?

>CPU bottlenecking 1080ti
Yes after looking at benchmarks, I was fearful of that.
>Using a stock clock i7 4790k and a 1080 oc'd
How would that fare with 1440p 144Hz gaming at very high/ultra? Any aftermarket card in mind?

desu tile based rendering isn't as big a deal as the scheduler changes.

Of course this is all cocktease for Navi. If Nvidia fucks up with Volta and doesn't do MCM then welp.

It was insane price/performance at the time. I had an AMD card die on me around that time and I hunted high and low for a competitive option on both sides of the fence, there simply wasn't anything.

FINE WINE

But seriously though, I don't mind my hardware aging well. I don't buy a new graphics card every other year, only when an upgrade is actually needed.

290 was cheaper and faster

Unless you're a dirty un-american communist.

I get proper OGL drivers with nvidia.
inb4 vulkan
Vulkan is not a replacement of OGL for serious applications, you fucking gaymen faggots.

Vulkan is not an AMD product, it is an open API defined by the Khronos Group, an industry consortium. It just happens to be based off Mantle.

Everyone was saying that at the time, but I couldn't find anywhere that was true. I think maybe people had already been through and bought them all out (many of them were out of stock or in low supply) or it was something to do with my being in Europe.

As someone who has an i7 4790k + Fury X @ 2560x1440~144, it depends on the game, but a 1080/Ti should handle it quite well.

You could struggle to get over 100fps, but it would be a GPU bottleneck in most games, especially with slow DDR3 RAM.
My 390X only struggles in Siege, but I know it's a RAM and CPU bottleneck plus Ubisoft's shit optimization, since it's only spasmodic fps drops, and only in that game.

Legitimately curious, what are these serious applications of OpenGL that aren't gaming? It's a fucking graphics API.

Bonus question: What's wrong with AMD's OpenGL support? I've developed two OpenGL render engines on AMD hardware, never encountered a problem.

DDR3 vs DDR4 isn't that big a difference in most games.

Fallout 4 is not most games.

He's a Linux hipster

I did it on Linux! I'll take the AMD drivers any day over that garbage NVIDIA calls Linux support.

Ubisoft optimization is non-existent

If he was then he'd be riding AMDs dick right now.

Coreboot confirmed, finally the secnuts can upgrade from Core 2 Quads.

I must admit the rare times I dualbooted lincuckx amd had better drivers desu

What monitor do you have? Acer or Asus?

Acer, XG270HU freesync.

It's because they open sourced it.

Nvidia still keeps everything under a chastity cage and fucks the open source guys. For fuck's sake, they even required all firmware to be signed recently, so anything Maxwell and newer is stuck at boot clocks on the open source drivers.

Any dead pixels? how's the Blb?

390x here, same boat. have a 1600p 60hz and looking to up the refresh rate to 100hz+

No dead pixels, uniform lighting so no blb issues, viewing angles are perfectly fine so long as you aren't retarded and expecting to use it from a non-standard angle.
I've heard the colour calibration needs some work, and I noticed it looked different to my old screen, but it's not something I've had issues with as such. It doesn't bother me at all, but you'd need to get a calibrator if you were looking to do colour-perfect work (but why would you on a TN?).
The ONLY issue I find is with Freesync. Because it's an older model monitor, Freesync has an issue where if you go below 45fps (I think that's the limit for this monitor? or it could be 30), it starts to flicker, and once that starts, you often need to tab out and back into your game to stop it from flickering white.


But yeah, I ordered the monitor way back when, and when I first got it I'd heard some horror stories and was really worried, so I put it through its paces straight away but found no issues. It's now been something like three years, I guess, and no issues have arisen since either, so I'm happy.

>7000 series destroyed by 500 and 600 series
>200 and 300 series destroyed by 700 and 900 series
>400 destroyed by 1000 series in everything except DX12.
>NVM lol
>VEGA WILL BEAT NVIDIA YOU WILL SEE!

And that's objectively comparing inferior AMD hardware. Even when AMD releases a solid product (like Ryzen), they still struggle to sell it because everything I mentioned above, and more, has ruined their reputation.

What about when the Linux kernel dude threw a hissy fit about the drivers not being up to standard or something? Did it get resolved?

I meant performance-wise.

and here are actual numbers

>7000 destroyed by 600 series
Miracle patch lol, AMD had the performance crown there for roughly a year. Power consumption was near identical as well

>200 series destroyed by 700 series
290Xs were sold out for six months straight. There were scalpers raising the price to $800

>300 series destroyed by 900 series
Fair, AMD bet on 20nm and lost.

>400 series destroyed by 1000 series
Only at the high end. 470 is value king. 1060 3gb is a cuck card, and 1060 6gb is too overpriced to compete with the 480 8gb. Normies gonna normie though.

>7000 series destroyed by 500 and 600 series
Was it? That's news to me.

>200 and 300 series destroyed by 700 and 900 series
EVERYONE was shilling the 200 and 300 series back then, I can't remember the benchmarks but it sure as shit didn't seem like a stomp, people don't usually shill AMD hardware.

>400 destroyed by 1000 series in everything except DX12.
Largely by more expensive cards.

>VEGA WILL BEAT NVIDIA YOU WILL SEE!
Probably not sadly.

>Hitman has 13% improvement
>slide says 23%
>average 16%

VEGA will be at BEST 1070's level.

please tell me how 7000 was destroyed by 500 and 600 series

>hd7970, 230w tdp, 50,7 fps in Crysis
>gtx680, 195w tdp, 51,1 fps in Crysis

gtx580 is not even fucking close to these two, but let's compare it to hd6970
>hd6970, 250w tdp, 37,5 fps in Crysis
>gtx580, 244w tdp, 38,5 fps in Crysis

let's go older
>hd5870, 188w tdp, 33,8 fps in Crysis
>gtx480, 250w tdp, 32,8 fps in Crysis

even older?
>hd4890, 190w tdp, 21 fps in Crysis
>gtx285, 204w tdp, 21,3 fps in Crysis

Real answer is no one knows yet.
But the 1080 Ti releasing this early with such a price drop leaves me suspicious. It's still not cheap in the slightest, mind you, but Vega is still at least a couple of months away, and with the leverage they had from the Titan XP pricing, they could have priced the Ti higher and gotten away with it, but they didn't. It leaves me wondering if Nvidia knows something about Vega we don't.

Judging by amount of shaders and clock speeds Vega will be AT LEAST at 1080's level.

It's already been demoed above that.

and with real benchmarks it will be shit, it's always like that.

...No it isn't. AMD's Polaris and Ryzen benchmarks have been recreated in the real world, they are truthful. AMD cherrypicks aggressively to show their products in the best light they can, but they don't lie.

I'm basing my calculations on real benchmarks of the Fury X, which has the same number of shaders at a 1GHz core clock. Vega will be clocked at 1.5GHz, so it will be at least 50% faster than Fury X (I'm not taking the Vega and Polaris optimizations into account), and therefore at least at 1080 level.

Read the small text: Nvidia used the 1080 launch driver as the reference, which is almost a year old.

Is that why their plebbit mods on the payroll delete anything that can hurt AYYMD's reputation?

I don't know what that's supposed to mean or what that has anything to do with benchmark accuracy.

Who gives a flying fucking shit about plebbit

>Vega be able to actually compete against the 1080ti?
No, the titan xp is not even the full gp102. (Quadro p6000 is faster)

>Let's be honest now, will Vega be able to actually compete against the 1080ti?
No. Vega will compete against Volta which isn't even coming out for a year.
Vega is not made to compete with the plebeian 1080Ti.

It took Nvidia 3 years to only somewhat catch up with GCN.

This is the first time AMD has actually been behind Nvidia in a long time, and it's only because they've put all their focus into not delaying Vega instead of trying to compete with Nvidia on 7 year old architectures.

It will become a good gaming chip.
Many games are maxing out a 7700K at under 144fps. The only way to get 144fps in many games will rely on developers optimizing for 6- and 8-core CPUs instead.

Holy shit

>(you)

That GamersNexus dude was literally clickbaiting for DAYS with Ryzen. Free popularity I guess.

>It took Nvidia 3 years to only somewhat catch up with GCN.
this sounds retarded but it's true

youtube.com/watch?v=XOGIDMJThto

Async has been in use for console development since 2009; I think it's the biggest reason why consoles went AMD.

Yeah, but his review is now the holy grail of reviews everyone should listen to, even though it's been wrong for 5 days.

>7000 series destroyed by 500 and 600 series
>> Seriously comparing godlike GCN to Fermi in 500 series
KKKK.
>200 and 300 series destroyed by 700 and 900 series
Not really, the 290X did well against the 780 and 970.
>400 destroyed by 1000 series in legacy APIs
Who cares.

Anyway, there's one major problem with Nvidia: those cards die right after the warranty period. AMD cards can die too, but they do it within a month or so, so you can RMA them. With Nvidia it looks like a setup.

Who the fuck listens to any youtuber besides fucking Wendell? They are a bunch of incompetent morons, just watch last Tech City episode.

Consoles want AMD because they are cheaper. AMD didn't make a lot of money with consoles. The profit margins are very low.

The real cash is in machine learning and Radeon instinct solutions are very impressive on paper this far.

Imho Vega didn't surpass the 1080 Ti, and it's delayed to optimize enterprise VR solutions. I really don't see Vega having the same meteor impact as Ryzen, although it would be impressive if it happened. The nature of the GPU market (gaming being cornered and so diverse) differs so much from pure processing.

No, consoles went AMD because only AMD can offer decently powerful complete package of CPU+GPU.

The RX 480 also has a pixel fill rate deficit, yet performs well above the 1060, which has twice its pixel fill rate, at higher resolutions.
This is down to the RX 480's superior cache, which is improving even more with Vega.

>there's one major problem with Nvidia: those cards die after a warranty period

But the fact is AMD don't make money with the PS4/Xbone; look at their financial quarter.

>will Vega be able to actually compete against the 1080ti?
Probably no, but literally zero fucks given in my use case. I'll probably end up buying it anyway. While Nvidia locks its shit down tighter than a chastity belt while the king is on crusade, AMD keeps it open. It was so much fun playing around with tweaking and BIOS editing the RX480 that Vega can't come soon enough.

>mfw just passed on buying a brand new GTX1080 for 390€, because got no use for it

>They are a bunch of incompetent morons
>just watch last Tech City episode

isn't that a bit contradictory?