Vega a shi-

>Vega a shi-
no way dude! look at 1080Ti

>vulkan

>Vega has a proper hardware scheduler
>Nvidia doesn't and uses drivers
>It's unfair that GCN benefits more from newer API's than Nvidia, DX11 is the only API that matters

Aaaaaaaand suddenly wolfenstein is AMDrones' favourite game out of nowhere

>proprietary video game

This guy gets it, now if only I could find an open source chan that didn't die within a few months

I bet you won't even play it because your "team" lost. (P.S. It's actually really great, you should play it if you liked the first one)

>vulkan

Nice job op...also furyxfags are so happy...

THANK YOU BASED AMD

wolfenstein uses fp16, and it's not even heavy use of it...

>That r9 390x

I'm not in any "team", I own a 7 year old gfx card.
I played the 2014 one, but I don't remember if I enjoyed it. Wouldn't buy a full priced game either way though, maybe I'll pirate it (probably gonna forget)

Wonder how they benchmarked it. Got through the game on the same settings with an RX 480 8GB and I was consistently above 100fps. The game is shit btw, don't waste your dime if you expect DOOM out of it

AMD fan here (not fanboy)
>cherry picking benchmarks
Come on fanboy you're (not) better than that.

AMD is great at modern triple A garbage that sucks

Oh, they ran the Manhattan level, which is literally a performance sinkhole thanks to pajeet code (it HALVES your fucking framerate compared to the rest of the game). It's almost like the purpose of this benchmark is to make Vega look good at all costs

>vulkan

FUCK OFF OP

The game is fine, but the gunplay needs way more accuracy; it's like they thought the horribly imprecise doom gunplay would work here

No way, AMD is really good at one or two games? Wow, it's just like the past 6 years

>One fucking game works out for VEGA
>99% of the other pc games out there run like shit on it
GEEEEEEEEEEEE

(you)

>triple A that sucks
Redundant.

>i like you
>You have no direction in life.
>You're an inspiration

>the fury x

HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

feelsbad

>vega 64 as fast or faster than a 1080
>shit
>vega 56 faster than 1070
>shit
pack it in boys, 1080s and 1070s are shit.

it's a bench with VULKAN

vulkan sounds better than anything nvidia related

Objectively false

Fury X losing to a 390x and 580?

not my problem if AMD can make use of low level API and novidia only uses dx

>all the numbers are completely different across every card between both benchmarks

Really fumbles my marbles

>with lower fps and more bugs than dx and opengl

Nvidia drivers update, OP is a shill

>(P.S. It's actually really great, you should play it if you liked the first one)

The game's garbage though. Literal communist propaganda with a QTE final boss for its four hour campaign (and more like two hours without the cutscenes).

>1080p
>2017

What went wrong, guys? Feels like we've been stuck on this resolution since 2009

Prices of the meme resolutions haven't dropped as fast as 1080p did

>Vega has a proper hardware scheduler
>Nvidia doesn't and uses drivers
If that were the problem, DX11 vs. Vulkan wouldn't matter, since it's the exact same shader compiler, just with a different front-end for SPIR-V instead of HLSL.
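
Rough sketch of what "different front-end" means here, going off the public API surface only (not pulled from the game or any actual driver, error handling skipped): with Vulkan you hand the driver pre-built SPIR-V, with D3D11 you hand the runtime HLSL and get DXBC back, and in both cases it's the vendor's back-end compiler inside the driver that lowers the result to GCN/Pascal machine code.

#include <stddef.h>
#include <vulkan/vulkan.h>
#include <d3dcompiler.h>   /* D3D11 path, Windows-only */

/* Vulkan front-end: SPIR-V goes straight to the driver, whose back-end
   compiler lowers it to the GPU's own ISA. Assumes a valid VkDevice and
   a SPIR-V blob already loaded from disk. */
VkShaderModule load_spirv(VkDevice device, const uint32_t *words, size_t size_bytes)
{
    VkShaderModuleCreateInfo info = {
        .sType    = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO,
        .codeSize = size_bytes,
        .pCode    = words,
    };
    VkShaderModule module = VK_NULL_HANDLE;
    vkCreateShaderModule(device, &info, NULL, &module);
    return module;
}

/* D3D11 front-end: HLSL is first compiled to DXBC bytecode, then the same
   driver back-end lowers that to the GPU's ISA. */
ID3DBlob *compile_hlsl(const char *source, size_t length)
{
    ID3DBlob *bytecode = NULL;
    ID3DBlob *errors   = NULL;
    D3DCompile(source, length, NULL, NULL, NULL,
               "main", "ps_5_0", 0, 0, &bytecode, &errors);
    return bytecode;
}

Same back-end either way, which is the point: the scheduling story doesn't change just because the front-end language did.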

>GPUs get better and cheaper
>even 1440p monitors that aren't shit cost too much for most people

Remember this leak of the fury x?

>tfw still running a 980ti

Literally best card.

Whatever you say.

>le dedicated gpu for one game/vulkan

The 980 Ti is the real interesting tidbit on this graph. Slowly sliding down the pack and already closer to 580 performance than 1070 or Vega performance, just like how the 780 Ti ended up when Nvidia ended driver support for it.

Counting the days until Volta and the 1080 Ti going the same way. :^)

I have a 760, dunno if it is any good in 2017

This bench is from AMD though

give us some examples m8

Even if it originally were, nVidia has made sure it isn't any longer. See .

a planned obsolescence driver every year keeps the goyim in fear

Since when is the 980ti faster than the 1070?

1070 and 980Ti are roughly the same in DX11, but Pascal is better at DX12 and Vulkan so it makes sense the 1070 is ahead.

The only thing I can think of is no GPU physics acceleration, like what nvidia has with physx/cuda.

what lol, i meant as in the name itself haha. Vulkan sounds kickass, or ultra nerdy while nvidia was cuda. pussy ass name bruh. cuda reminds me of soy

this thing is garbage i get ~80-90 on my 1070

>old cards perform worse than new cards
fucking jews
nonono didn't you know, the nvidiajews are downgrading the 980TI in reality it's as good as the 1080 but they make it not work with their (((drivers)))

because nvidia can't do async, can't do dx12 and vulkan. It's also gimpworks certified

>old cards perform worse than new cards

It's not about that and you know it, my dear kike shill. The 980 Ti utterly fucking destroyed Poolaris when it came out. Now they're almost equal. The 980 Ti was on par with a 1070 a year ago. Now it's much slower.

Really boggles my noggin.

It's not. The AI is garbage, the game is piss easy and the main campaign lasts only 4 hours

>Tfw my 290 performs better than people's 980ti

And that's why they're called ADVANCED micro devices guys. Shit's literally from the future

...

>It takes 1+ years for AYYYMD to put out drivers that actually do their cards justice.
Is that supposed to be a selling point?

They COULDA' picked a better name, true.

>Open API doesn't count and is considered an AMD cheat
>locked Gimpworks injected into every nvidia-sponsored game, designed specifically for nvidia cards with no way for AMD to adjust for it, is totally fair

didn't bethesda partner with AMD recently? could explain the better performance on AMD cards

Why is id Software the only dev that gives a fuck about Vulkan? More devs should support it so everyone can just make the switch to Linux without having to sacrifice 90% of their game library. It also shits on DX performance-wise.

Meanwhile, at 4k.

Vega is and will be shit until all uarch features work

Hawaii truly is the godmachine of gpus.

and did you consider that new games just run better on new cards?
i mean, show me the benchmarks of it running on par a year ago and far worse now.

did you not hear of the AMD finewine(tm) technology?

>390x which is a rebranded 290x which came out in 2013 which in turn was designed to combat the 780 beats the fuck out of 980ti
Fine wine at its finest.

too bad it hungers for power so goddamn hard holee fuck

Remember when Sup Forums said the titan was untouchable? Remember when Sup Forums said the 780ti is better than 290x? Remember when Sup Forums said there is no way hawaii could catch the 980?


It's a 250W card like the 980ti is.

>all this baseless conjecture

>2013 card still manages to pull 60+ fps at 4K in 2017
Even though it's on Vulkan, how does AMD do it?

meanwhile at colour compression
youtube.com/watch?v=Kqadf7TfMy8

this is how nvidia wins

[Citation needed]

What's the point of the 1070Ti again?

By making superior hardware ahead of its time before the software support arrives

I know right

omg red is so much larger. 1070ti BTFO

really activates my almonds

>Nvidia Chartworks™

Only a handful of studios develop their own engines and have coders capable enough to use Vulkan/DX12.
Most studios out there just use Unreal/Cry/Unity as-is and are incapable of any low-level memory smartassery
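
For anyone wondering what "low-level memory smartassery" actually looks like, here's a minimal sketch of Vulkan's explicit memory model (function name and structure are mine, error checks skipped, assumes the device and buffer already exist): the application, not the driver, has to pick a memory type, allocate it, and bind the buffer to it.

#include <vulkan/vulkan.h>

/* Back a VkBuffer with device memory by hand: query requirements, pick a
   compatible memory type with the wanted properties, allocate, bind. */
VkDeviceMemory back_buffer_with_memory(VkPhysicalDevice phys, VkDevice device,
                                       VkBuffer buffer, VkMemoryPropertyFlags wanted)
{
    VkMemoryRequirements req;
    vkGetBufferMemoryRequirements(device, buffer, &req);

    VkPhysicalDeviceMemoryProperties props;
    vkGetPhysicalDeviceMemoryProperties(phys, &props);

    /* Pick the first memory type the buffer allows that also has the
       properties we asked for (e.g. DEVICE_LOCAL or HOST_VISIBLE). */
    uint32_t type_index = 0;
    for (uint32_t i = 0; i < props.memoryTypeCount; ++i) {
        if ((req.memoryTypeBits & (1u << i)) &&
            (props.memoryTypes[i].propertyFlags & wanted) == wanted) {
            type_index = i;
            break;
        }
    }

    VkMemoryAllocateInfo alloc = {
        .sType           = VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_INFO,
        .allocationSize  = req.size,
        .memoryTypeIndex = type_index,
    };
    VkDeviceMemory memory = VK_NULL_HANDLE;
    vkAllocateMemory(device, &alloc, NULL, &memory);
    vkBindBufferMemory(device, buffer, memory, 0);
    return memory;
}

Under OpenGL or DX11 a single glBufferData-style call does all of that inside the driver, which is why studios shipping on a stock engine never have to touch it.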

Charts you say?

Anyone wanna work out the percentage difference?

libshit cuck detected.

The game is fucking awful in every facet. 6 hour campaign, poor AI, communist+SJW+Jew propaganda up the ass.

...

kek

Might as well buy a 1080

>i-it's okay when AMD does it!

it vacuums electricity like a jew vacuuming heat from an oven

AMD doesn't lock Nvidia out of developing their own rapid packed math

AMD doesn't lock Nvidia out of anything. Nvidia are free to add Freesync if they so wish. But Nvidia lock everyone out of everything.

That can't be right. I'm an idiot and learned how to use OpenGL pretty quickly. As for DX, it's only one small step down from Unity, pretty high-level stuff.
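
To put that in perspective, this is roughly all it takes to get GL up and clearing the screen (GLFW is just my pick for the windowing side, nothing the thread implies): the driver hides allocation, synchronisation and scheduling behind a couple of calls, compare that with the Vulkan memory sketch above.

/* About the smallest "hello GPU" you can write against OpenGL. */
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;
    GLFWwindow *win = glfwCreateWindow(640, 480, "gl", NULL, NULL);
    if (!win)
        return 1;
    glfwMakeContextCurrent(win);

    while (!glfwWindowShouldClose(win)) {
        glClearColor(0.2f, 0.0f, 0.4f, 1.0f);   /* driver manages the rest */
        glClear(GL_COLOR_BUFFER_BIT);
        glfwSwapBuffers(win);
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}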

Are the engines free or do they require licensing deals? It may explain some things if they licensed for a number of years.

>tfw liberal but the over the top 'fuk wypipo' shit turned me off
why does sweden have to ruin everything for us?
they should've left wolfenstein alone, 1 was great

>Are the engines free

What do you mean? They're not engines, they're SDKs. You use the SDK to make an engine. OpenGL and DX are both free.

Unreal/Cry/Unity, are they free?

No, they are proprietary.

They want royalties: if you charge for a product they take a share; if you make it free, it's free. Unity is freeware, i.e. free as in free beer, with paid extra services like the store

Does a FOSS 3D videogame engine even exist?
What is the Blender of game engines?