Stop blaming AMD

Vega is shit only because the foundry is shit; TSMC is better than GlobalFoundries for GPUs. The Vega design itself is good.

Other urls found in this thread:

pcgameshardware.de/Radeon-RX-Vega-64-Grafikkarte-266623/Tests/Benchmark-Preis-Release-1235445/3/
twitter.com/ryszu/status/896304786307469313
anandtech.com/show/11102/nvidia-announces-quadro-gp100
anandtech.com/show/10222/nvidia-announces-tesla-p100-accelerator-pascal-power-for-hpc
twitter.com/Radeon/status/671058196547706880
tomshardware.com/reviews/amd-radeon-vega-frontier-edition-16gb,5128-6.html
forum.beyond3d.com/forums/architecture-and-products.38/

hardware's fine, how about they hire more fucking driver devs and actually start using all those features

with more devs they could start doing more game-specific hacks the way nvidia does

>AMDrones in denial

Vega is an overrated, overhyped hot piece of shit, its only use is for miners/workstations or AMD fans willing to pay extra money for lower performance.

Vega sucks in every game and in efficiency; this is not a driver-specific issue.

Vega was only intended to be clocked around 1.5 GHz ("25 TFLOPS fp16"), but they did another factory overclock special when performance in actual games turned out shitty.

Whether that's a fundamental hardware design issue or just really poor drivers remains to be seen, but "efficiency" is primarily a function of how fast a company wants the standard clocking to be.
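
Quick illustration of that point (a sketch, not measured Vega data: the cubic model just assumes stable voltage rises roughly linearly with clock near the top of the V/f curve):

```python
# Dynamic power scales with C * V^2 * f. If voltage rises roughly
# linearly with frequency near the top of the V/f curve, power grows
# roughly with f^3. Illustrative model only, not Vega measurements.

def relative_power(f_base_ghz: float, f_target_ghz: float) -> float:
    """Relative dynamic power under a linear V-f scaling assumption."""
    return (f_target_ghz / f_base_ghz) ** 3

# ~12% more clock (1.5 -> 1.677 GHz) costs ~40% more power here:
print(f"{relative_power(1.5, 1.677):.2f}x")  # ~1.40x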

It was actually 225W at 12TF (SP) originally. Then 12.5TF (25TF half precision), and then 13TF. It kept going up...
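
For reference, every TFLOPS figure in this thread falls out of one formula. Vega 64's 4096 stream processors and 2 FLOPs per clock (FMA) are the real specs; treat the clocks as the marketing targets being discussed:

```python
# TFLOPS = shaders * FLOPs-per-clock * clock. Packed math doubles the
# FP16 rate, hence "25 TFLOPS fp16" at ~12.5 TF single precision.

def tflops(shaders: int, clock_ghz: float, flops_per_clock: int = 2) -> float:
    return shaders * flops_per_clock * clock_ghz / 1000.0

print(round(tflops(4096, 1.500), 1))  # 12.3 TF SP (~25 TF fp16)
print(round(tflops(4096, 1.526), 1))  # 12.5 TF SP
print(round(tflops(4096, 1.677), 1))  # 13.7 TF SP (liquid-cooled boost)
```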

Thread.

>Vega sucks in every game
the funniest thing about this launch: shills aren't using graphs, like, at all
the board was full of graphs at the 580 launch because it was neck and neck with the 1060
vega - nope, no graphs, just words in the "no (you)!" category

really makes you think

Yup, no graphs whatsoever...

No graphs at all.

a 1080 costs ~$70/year to run
Vega 64 at 350 W power draw costs ~$80/year

hilariously, they have identical temperatures
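
Those yearly figures only work out under certain usage assumptions; here's one set that lands near them (the hours/day and $/kWh are guesses for illustration, not anything stated in the thread):

```python
# Yearly electricity cost = kW * hours/year * $/kWh.

def yearly_cost(watts: float, hours_per_day: float = 5.0,
                usd_per_kwh: float = 0.12) -> float:
    return watts / 1000.0 * hours_per_day * 365 * usd_per_kwh

print(f"${yearly_cost(300):.0f}")  # ~300 W system (1080): ~$66/yr
print(f"${yearly_cost(350):.0f}")  # ~350 W system (Vega 64): ~$77/yr
```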

A year and a half of extra development time, and this is AMD's response. No drivers. Overvolted cards to salvage as many shitty dies as they can muster. And the worst performance per watt this side of Fermi.

>you will never be a desing

Nope, the problem is that Vega was built around the professional and general compute market, not silly-ass gayming.

AMD pretty much gave up on the high-end gayming market once Maxwell came around and wrecked their shit with the 970.

Vega 64s are really just rejects that ate too much power for the Frontier and Instinct markets.

>Shilling so hard for nvidia
They wouldn't even give a NEET like you $1.

Actually, Vega outperforms Hawaii and Fury and is more energy efficient than them. The problem is that Pascal is so bloody efficient, thanks to the 16nm process and building upon the already efficient Maxwell.

>Facts are shilling

Don't start none, won't be none.

>facts are shilling
Even if nvidia is better, stop wasting your time posting so many comments (with pics, too).
You are wasting your own life defending a big corporation that wouldn't give a fuck about you.

Received mine yesterday, and it's total garbage.
Mostly because it's loud as fuck.
I'm really surprised it makes so much noise. I had reference RX 480s before and they were half as loud as this one.
Also, undervolting doesn't seem to work at all.

>buying a blower style, ever
you are an idiot.

>Vega is shit
I want Sup Forums to get out. Vega is a workstation card and it excels at that; gaming is only a secondary task for it, and whether it is good at it or not shouldn't matter to the hard-working white man who will be buying the card.

I don't mind, really. It's not like I game that much to begin with.
Maybe I'll stick an after-market cooler on it later, if they ever fix their drivers.

>I don't mind, really
you just did.
>Maybe I'll stick an after-market cooler on it later
this has to be your priority.

So what is the point of the Vega (((((RX)))))? Workstation cards like Quadros are good at gayming.

It's most likely gonna be the same as the 480s.
Do benchmarks for a week to find good settings.
Then forget them when a driver update hits.
So fuck it, default clocks all the way.
Maybe play an occasional AAA game, only to return to Factorio/Kerbal Space Program.
I have no idea why I keep upgrading for no reason.

Cheaper than the Quadro, and a working man doesn't think about gaming when buying a workstation card. Get over it Sup Forums, AMD doesn't care about you anymore.

Gaymen drivers for RX Vega are not ready yet.

>I have no idea why I keep upgrading for no reason.
it is simple. because you can.

Shouldn't AMD be held responsible for releasing a card without drivers?

>Gaymen drivers for RX Vega are not ready yet.
[Citation]
>AMD doesn't care about you anymore
This is the problem

>[Citation]
Here, catch.
pcgameshardware.de/Radeon-RX-Vega-64-Grafikkarte-266623/Tests/Benchmark-Preis-Release-1235445/3/
Well it has drivers.
They are alpha-quality though.

>pcgameshardware.de/Radeon-RX-Vega-64-Grafikkarte-266623/Tests/Benchmark-Preis-Release-1235445/3/

Can you elaborate?

>rx vega sucks because it's a workstation card

Oh boy...

>Can you elaborate?
Tick the RX 580 in the list and look closely at the number of culled tris.
NGG path is currently disabled for RX Vega; it works like a bigger Polaris.
PDA is working though; just compare it to Fiji, clock for clock.

>This is the problem
It is not. AMD was (and maybe still is) in a financial crisis. When you are in that situation you have to trim the fat and focus on what brings in the most money (in this case, business clients and professionals).

Gaming brings the most money, even for nVidia.
It's just that the gaymen market is extremely volatile if you're not nVidia.

you just undermined yourself in 1 sentence.

>what brings in the most money (in this case, business clients and professionals)

yeah, I've never seen a single MI25 or WX 9100 in my life

Linux plx go

>[Citation]
Hilariously enough, you'd have to be an Nvidia fanboy or an idiot to not understand this.

Every AMD GCN generation, save for the blitzkrieg-up-Nvidia's-ass 7000 series (which, I remind you, was the last AMD GPU arch Raja Koduri was 100% involved in from start to finish before leaving), underperformed against its value competition for about 8-10 months until driver development caught up with the card's programming needs.
Just compare GCN's programming model to Nvidia's [Kepler/Maxwell/Pascal] model -
GCN is a clusterfuck "jack of all trades" compute-oriented (read: throughput at the expense of latency) design which needs a ridiculous amount of hand-tailored code to function optimally for any given non-compute workload.
It's that bad for gaming. Gaming needs latency-over-throughput functionality.

GCN is the exact opposite of a gaming arch, and the fact that drivers can extract performance equivalent to the competition, no matter the manpower AMD has or the time it takes, is a testament to how resilient the architecture really is.

Vega is GCN in ISA only.
That, and some legacy features like the legacy geometry pipeline.
Anyway, if the OG Greenland was just a bigger Polaris, Raja made the right call canning it in favour of the current Vega.

So that's why "Gaymen drivers for RX Vega are not ready yet"?

>NGG path is currently disabled
No, you're just bad at reading.

The NGG fastpath requires developers to code specifically for it. Maybe in time AMD's driver team will also code primitive geometry functions to use NGG, but primarily it's the devs who need to do the work and implement it in their engines.

Basically, all older games and many released in the near future will not use NGG functions. Further out, it's /possible/ that more games than not will be covered by driver work and/or developer work, but it's unlikely due to market share.

>The NGG fastpath requires developers to code specifically for it.
HAHAHAHAHAHA
WHAT?
twitter.com/ryszu/status/896304786307469313

Is Vega like a GP100 (a pure compute card)?

>the absolute state of amdrones

>bad facts about amd
>bawww nvidia shill bawww let me shill for amd baww
>amdrone cherry picks facts about amd and gets called out
>bawww shitposting

AMDrones are cancer who ran this board into the ground.

No.
Vega 20 would be like GP100/GV100.
This is more of a GP102-esque thing, with the additional benefit of packed math.

>GCN is the exact opposite of a gaming arch
I guess that is why it's in every console.
Seriously, in AMD's place I'd ultimatum the heck out of developers to optimize for AMD and threaten firmware changes to make specific games run terribly on consoles.
games developed for consoles first (AMD) but running worse on AMD - an oxymoron

things are moving in this direction, I guess; sweeney on amd stage is a big sign

These stats are for the SXM2 GP100, lmao.

>sweeney on amd stage is a big sign
WHAT?

anandtech.com/show/11102/nvidia-announces-quadro-gp100

anandtech.com/show/10222/nvidia-announces-tesla-p100-accelerator-pascal-power-for-hpc

when they announced all the Vega stuff, Raja gifted him a Threadripper; it was funny
I'm surprised nobody talks about it - not many people watched the presentation, I guess.

Raja gave Sweeney a Vega Nano though.
Anyway, this is big.
Like, BIG. Epic was always pretty much hostile to ATi/AMD.

Nothing in the Vega whitepaper suggests that the NGG/primitive geometry functionality is simply transparent and works in every case by using basic driver routines.

>The [NGG] path is much more...programmable than before
>wide variety of rendering technologies
>that could benefit
>we can envision even more uses

Instead, and going by previous whitepaper language, the takeaway is that Vega's NGG requires specific optimization for every use case to perform properly.

And we all know the state of AMD's driver development.

The consoles have their own proprietary APIs which are tailored specifically for graphics processing from the beginning.
AMD has to take DX11 and DX12 requirements and mold the code to their GCN ISA's capabilities.

It's quite a different beast, but if we're going to complain, it's really the fault of Sony and M$FT.

>Nothing in the Vega whitepaper suggests that the NGG/primitive geometry functionality is simply transparent and works in every case by using basic driver routines.
LITERALLY
AMD
DRIVER
DEV
CONFIRMED
THAT
NGG IS TRANSPARENT
Please do NOT shitpost before doing a fact check.

anon, Rys is the driver dev.
the fucker promised prim shaders would work on launch day though, probably confused builds
so fun things are yet to come - the non-beta Vega driver isn't out yet, so who knows, maybe it will gain 10%

Go ahead and be a complete ass WRT ignoring the words used.
I did concede and I did stop saying game devs would have to code for the NGG's full capabilities.

Instead, I said that
GOING
BY
PAST
EXPERIENCE
WITH
THE
COMPANY
It's going to be a cold day in hell before we see this new geometry technology fully working in most cases.
>because AMD's driver team is anemic and overloaded

So what's the difference between GCN and NCU(Vega)? Is NCU the next version of GCN or an entirely new architecture?

Pic: photoshop from a Redditor

>GOING
>BY
>PAST
>EXPERIENCE
>WITH
>THE
>COMPANY
I will never buy an nVidia card.
Thank you, card-killing drivers!

>i don't know how hardware works: the post

over the past 3 years the driver team has gotten a lot faster, especially the last 8 months
pretty sure they simply ignored the gaming cards and focused only on the pro line drivers; I'd say this October, with the new Crimson iteration, we might get another boost

hardware is NOT fine lol.

AMD fucked up.

Hardware is literally fine in every way possible.
Also, reddit spacing.

dude, performance-wise it could be on par with a 1080 for sure, but it draws way too much power doing the exact same thing.

so it's not an efficient card.

You need to go back.

>amd didn't release a driver locking fans at 20% and killing countless gpus
twitter.com/Radeon/status/671058196547706880

NVIDIA did it first, and several times.
So no,
>going by past experience
is a meme.

and again, how is it relevant?
temperatures: the same
noise level: the same
electricity bill: $10 difference per year
potential: who knows how much faster it will get

to be fair, it only happened if you used Afterburner; still a fuck-up
also nvidia murdered laptop displays last year, which is worse than a $200 gpu

>Hardware is literally fine in every way possible.
so this is what an amdtard looks like

You need to go back.

back to /r/amd

Am I triggering you?

Are you

becoming triggered,

by my use of line spaces?

>do
>we

>have
>a

>problem?

>it's the drivers

yes, Vega, a card that is literally Fiji on a die shrink, sucks at games, and it's not a driver problem
meanwhile, when I post this tomshardware.com/reviews/amd-radeon-vega-frontier-edition-16gb,5128-6.html
not a single hater can explain how a card that is supposedly shit because the hardware is shit is literally shitting on nvidia in anything compute

really makes you think

The benches you posted are not compute.
And shitposters gotta shitpost, anon.

Power saver mode. Custom fan profile. And maybe undervolt it a tad.

That should solve it.

Ffs man, it is not.

Fine would be if it performed on par with a 1080 on a die 50% smaller while consuming half the power it does now.

That would be a good chip.

what good is a uarch design if your driver team can't make good drivers for it? It's a total package, baby.

you're blaming the hardware while everyone agrees the cards don't even have proper drivers?
get your illiterate ass out of here and RMA yourself

Hardware is fine; they just don't have enough driver devs, so they launched the card with alpha drivers that still have major critical uarch features disabled.

Primitive shaders are a critical uarch feature designed to overcome GCN's 4-triangles-per-clock limitation, and without them activated, Vega is massively front-end bottlenecked, meaning you can't even see the full benefits of primitive discard and DSBR later in the rendering pipeline. Until they fix this shit, who fucking knows what Vega is actually capable of in games?
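
Rough numbers on that bottleneck (the 4/clock limit is GCN's front end; the 17 prims/clock figure is AMD's own primitive-shader claim, mentioned again later in this thread, taken at face value):

```python
# Peak geometry rate scales with clock only, not shader count, so a
# wider GPU gains nothing at the front end without primitive shaders.

clock_ghz = 1.5

fixed_path = 4 * clock_ghz   # ~6.0 Gtris/s through the fixed pipeline
ngg_path = 17 * clock_ghz    # ~25.5 Gprims/s culled via prim shaders (AMD claim)

print(f"fixed function: {fixed_path:.1f} Gtris/s")
print(f"NGG culling:    {ngg_path:.1f} Gprims/s")
```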

Well, you still have baseline perf right now.
It's an
>it can't get any worse
situation.

>Nope, the problem is that Vega was built around the professional and general compute market, not silly-ass gayming.

False; this is only true in the sense that they concentrated on getting the pro drivers finished first. The arch is fine for gaming, assuming the uarch features work as designed once actually implemented in the gaming drivers.

DESINGATED

>total system under 500
>bad
KEQUE

>wait for drivers
They had 2 years.
kys

>They had 2 years.
2 years...
Vega taped out in early-to-mid 2016.

>2 years

Are you seriously this retarded? Everyone, even AdoredTV, says it's a power hog.

Shitty fps/watt. Drivers cannot solve this.

So please go suck your father's clit.

>drivers can't solve that
HAHHAAHHAHAHA

let's talk a bit seriously, because it's clear you lack any understanding of GPUs.
after a lot of testing on Beyond3D (yes, I'm the fag that posted a lot of info about it yesterday, since I own 2 of them):
1) there is NO power profile; once the card fires up for 3D it literally powers up EVERYTHING, regardless of whether it's being used or not
2) SIMD pipeline profiles aren't there whatsoever, so everything is forced to run in parallel when it doesn't need to -> insane power draw
3) culling in general is disabled; otherwise, with a buffer of 17 triangles, we would have seen insane performance, especially in those shitty nvidia games
4) HBCC is disabled, and given that it will probably need specific profiles for each game, it will be some time till we see something
5) Adored is a youtuber and nothing more
currently most of the stuff IS working in the pro driver, which is why it's ahead of a P6000 90% of the time...
it's a fucking driver problem, because AMD back then didn't have the money to throw at 2 teams developing at the same time - they couldn't experiment with the older cards and develop the new stuff for Vega simultaneously
if AMD fails with Navi (assuming they DO have money to throw at it now), I guess they'll just leave the gaming market altogether

They didn't have two years, and they didn't have enough driver devs to finish the pro drivers and the gaming drivers at the same time. They chose to prioritize getting the pro drivers done first, which is why Vega FE already trades blows with a P6000 in pro 3D rendering applications.

OK, fair points. I'm not saying drivers won't increase performance. And let's say they do. You will maybe see a 25% increase overall. Tops. It's still a power-hungry card nonetheless.

550 watts is a bit much on turbo.

>Drivers cannot solve this
Actually they can, since one of Vega's major uarch features isn't working yet in the gaming drivers: one that would simultaneously increase gaming performance and reduce the power wasted on processing primitives, and their attributes, that would be discarded anyway.

If they can get primitive shaders to work like the whitepaper claims, Vega will have nothing to worry about in perf/watt terms once the drivers actually get out of fucking half-finished alpha stage.
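
For the curious, this is the kind of test a primitive shader can run before a triangle ever touches the rasterizer. A generic backface/zero-area culling sketch, NOT AMD's actual shader code:

```python
def signed_area_2x(v0, v1, v2) -> float:
    """Twice the signed screen-space area of a triangle."""
    return ((v1[0] - v0[0]) * (v2[1] - v0[1])
            - (v2[0] - v0[0]) * (v1[1] - v0[1]))

def should_cull(v0, v1, v2) -> bool:
    # Negative area => back-facing (CCW front faces assumed);
    # zero area => degenerate. Either way the triangle, its attribute
    # work, and its bandwidth never reach the rest of the pipeline.
    return signed_area_2x(v0, v1, v2) <= 0.0

print(should_cull((0, 0), (1, 0), (0, 1)))  # False: front-facing, kept
print(should_cull((0, 0), (0, 1), (1, 0)))  # True: back-facing, culled
```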

Can you run Kanter's trianglebin test and capture it on video?

>25%
That's a pretty pessimistic guess.

Which is why Navi is so fucked compared to Volta.

we know from Tomb Raider that HBCC can give at least a +30% bump in the minimums alone
tile-based rasterization is an unknown factor since we don't know how well it runs...
in 2 days I will run more stuff and post it here
forum.beyond3d.com/forums/architecture-and-products.38/
make a list of what you want

Hey, thanks for your contributions on B3D; I've actually been learning a lot from there. It's only thanks to that thread that I now have some fucking clue what exactly primitive shaders are supposed to do, and why they're such a big deal.

Except it beats the 1080 in most cases and only draws 10-20% more power. Literally less than $10 a year when run at max 24x7x365.

>And let's say they do. You will maybe see a 25% increase overall
>25%
>25%
It will rape the Titan lol.

And you know it's getting better, because the architecture is new and the drivers are raw.

Right now it's better than the 1080 from the start. It will pull way ahead given some time.

The 480 was overall weaker than the 1060 at launch; now it shits on the 1060 across the board.

>tile-based rasterization is an unknown factor since we don't know how well it runs...
According to AMD, it gave 30% bandwidth savings on average, with a 10% boost in average FPS.
Prim shaders are nice - more culling and fewer L2$ flushes - all in one neat package.
Kudos to Raja and his team for that.
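
To put that 30% in perspective, a back-of-the-envelope conversion (Vega 64's ~484 GB/s HBM2 figure is the official spec; the 30% is AMD's claim quoted above):

```python
# 30% less traffic for the same frame is equivalent to ~1.43x
# effective bandwidth.

raw_bw = 484.0                   # GB/s, Vega 64 HBM2
effective = raw_bw / (1 - 0.30)  # same work, 30% less traffic
print(f"{effective:.0f} GB/s effective (~{effective / raw_bw:.2f}x)")  # ~691 GB/s
```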