DX12=meme

When will AMDrones finally admit that DX12 is a meme?

>DICE literally say that the beta's DX12 implementation is incomplete and shouldn't be used
>tech sites start benchmarking it and complaining

B R A V O
R
A
V
O

Porting stuff to DX12, or even developing with DX12 as the base, is more complex than DX11.

Where is the Vulkan API, actually?

>Thinking that any improvement to DX12 will be made with less than 2 months until release

literally kill yourself

You seem to forget that dx12 is broken in the game and they say not to use it. Silly shill fuck.

...

>dx 12 is broken.

Fixed that for you.
There is no performance gain to be seen, not even on AMD cards.

We will stick with DX11 for a long time still.

Not OP but is this the new explanation devs will give for stuff? This implementation is incomplete but will be fixed soon? Will they offer the fix as a DLC?

DirectX was a meme from day one. I hope Vulkan takes off and becomes popular just so that Microsoft stops developing DirectX and focuses on Vulkan instead.

Now you know why it's a bad idea to give devs control over graphics hardware. They dunno what they're doing.

Where's that assmad amdrone who was blaming nvidia for "paying off" the day sex devs to release dx12 mode a month late? The retard probably thinks nvidia rigged bf1 too.

As usual AMDrones wait patiently, forever.

DICE has been AMD-sponsored for years, and AMD basically wrote their Mantle and DX12 renderers. I seriously doubt it's incomplete; much more likely it's just damage control to hide the fact that NVIDIA has better real-world performance.

ITT nobody realises Nvidia released a driver for the BF1 beta, while AMD has not.

DICE uses AMD for all their dev systems and has AMD engineers working in their offices. I seriously doubt AMD hasn't optimized their drivers for BF1 already.

>muh drivers

Funny. I thought people were so excited over DX12 because it got rid of AMD's shitty driver disadvantage.

What AMD has and what they have released are two different things. I have no doubt there will be day one driver support from AMD for the release of the game - they just haven't provided support for the beta.

You still need driver tweaks to squeeze out performance. All we know right now is that DX12 provides negative scaling over DX11 in the beta - that's all. Things may change upon release, or they may not.

>visual fidelity remains the same
>but with lower performance

nvidia should be praised for how good their dx11 drivers are.

>mfw nvidia performs better even on amd written software

That sounds really unlikely and stupid; they would at least have dev systems with hardware from both vendors and collaborate with both companies on how best to schedule shit.

Yep DX12 is basically pure ass

There's a reason why nobody fucking uses it, it's unoptimized shit

The actual reason is that most studios rush shit out the door, or hire a horde of monkeys to do the programming, or both. With DX11, Nvidia and AMD usually end up reengineering the entire renderer, trying to manually configure and work around all the developer's retardation. With DX12, the developers are given more raw access and shit gets fucked if they fuck it.
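Rough idea of the bookkeeping that shifts onto the dev with DX12 - a minimal C++ sketch, assuming d3d12.h and an already-created device and command list, where renderTarget is a hypothetical ID3D12Resource* (the back buffer). In DX11 the driver tracks resource state behind the scenes; in DX12 you record the transition yourself, and if you get it wrong, shit gets fucked exactly as described above.

// transition the back buffer from "presentable" to "render target" before drawing
D3D12_RESOURCE_BARRIER barrier = {};
barrier.Type = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
barrier.Flags = D3D12_RESOURCE_BARRIER_FLAG_NONE;
barrier.Transition.pResource = renderTarget;                          // hypothetical ID3D12Resource*
barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_PRESENT;
barrier.Transition.StateAfter = D3D12_RESOURCE_STATE_RENDER_TARGET;
barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
commandList->ResourceBarrier(1, &barrier);    // the driver no longer does this for you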

Dx12 is broken in almost every game.
Devs are just porting it over very poorly

It's a free beta, you dumb autist.

Except Nvidia cards perform worse in DX12 too. And we already know that Frostbite 3 performs well on AMD cards, both in DX11 and in Mantle. Battlefield 1 is far from the first game to use the engine, or the first to implement a bare-metal API in it. It's a beta. If you're taking anything from it in terms of performance, after being explicitly told not to by the developers, you have brain problems.

Who gives a shit about DX12 anyway, it's all about Vulkan

It's not broken in nearly every game, just in Tomb Raider (funnily enough an Nvidia-sponsored game).

DX12 is more than a magic performance boost you know. What it gains in performance is immediately lost in fancy new graphic features.

Broken doesn't mean Nvidia doing better than AMD, you shill; broken means neither side gaining any performance, or even losing some.

>tfw feel guilty for building an intel+nvidia system because you associate it with autistic neckbeards who shitpost about GPUs on Sup Forums

sure, there's no better alternative because AMD products are shit, but to think I'm being associated with these gaymer kiddos with their l33t razer shit who sit in basements and get into these dumbass arguments feels awful.

They're paid by AMD not to do that. DICE has been in AMD's court for years now.

^this
If there is no performance gain in DX12 then the DX12 implementation is broken. The whole point of DX12 is the performance gain.

Apart from the fact that AMD is seeing a performance gain in every DX12 title released - even Tomb Raider these days.

b-b-but we do senpai

Vulkan uber alles.

WHO GIVES A SHIT ABOUT DX12
AMD NEEDS TO MAKE ACTUAL OPENGL DRIVERS
CANT PLAY MUH PS2 GAMES WITH OPENGL PLUGIN BECAUSE AMD HAS NO SUPPORT FOR 4.5 REEEE

>amd
>drivers

>benchmarks of betas

Has Sup Forums really stooped to Sup Forums levels of meming?

IF THEYRE MAKING DRIVERS FOR ALL THIS NEXT GEN SHIT WHY ISNT THERE OPENGL

DX11 just werks™

yes dx12 is a vendor-locked piece of shit

Not for a lot of games.
Some get tiny graphical glitches, some are fucking unplayable.

Haha, having to search a German website for a benchmark that supports your side...

you fucking retard

Tomb Raider is pretty solid nowadays; it's one of the very few games with DX12 multi-GPU support, and it actually runs better than DX11 with SLI/CF. I played it with 2 1080s and a 4.7GHz 4790K at 4K: in DX11 I had drops to 40FPS, while DX12 never dropped below 60 at the same settings.

>AMD NEEDS TO MAKE ACTUAL OPENGL DRIVERS
Good luck, AMD's DX/OGL drivers are so shitty they're betting everything on the DX12/Vulkan marketing to actually get their GPUs to run properly.

If you'd followed the gpu market for the last 15 years, you'd know you go Nvidia for working OGL. Funny how AMDrones accuse Nvidia of making gaymur cards while they themselves don't have the working OGL drivers used in every professional environment.

DX12 is a meme, only useful for certain types of games like AotS, which no one plays anyway

Because there are no recent OpenGL games that even need a dedicated GPU to run.

>Nvidia shill makes waiting an AMD meme so Nvidiniggers buy their overpriced shit cards even sooner

Pottery

what about DOOM and Nu Male's Sky

if DOOM didn't get that Vulkan update, AMDers on Sup Forums would have one less thing to argue with

Why would anyone use OpenGL when Vulkan exists?

Because it's what people already use and have been using for a while now

>AMDrones will claim NVIDIA paid off DICE

>it's alright if a game is AMD-biased

>no man's sky
gamersnexus.net/guides/2561-no-mans-sky-frametime-performance-review-poor-performance

KEK

it's OpenGL so it counts

Isn't Vulkan just a shitty OpenGL?

People use and have used DirectX for 99.99% of games for the last 10 years you autist.

Well, Sup Forums is riddled with Linux cunts.

No you fucking retard.

>AMDfags buy AMD to free themselves from da Joos

>But diss Linux and support da Joos at M$ because only Nvidia makes decent OGL drivers

make up your fucking mind Pajeet

who shill for AMD, which doesn't have proper OpenGL drivers
really makes you think

What's the big difference?

Vulkan and DX12 are Mantle: turbo edition.

DX11 PCSX2 plugin is a complete joke compared to nightly OGL builds
also
>windows

In literally every other game, there are performance gains across the board. DICE isn't done writing the DX12 implementation for the game. It's in beta. Like 3 or 4 other posts have pointed this out. This is not a DX12 problem, it's a problem with the game's DX12 code not being finished yet. Do you understand now?

>amd makes xbone apoo
>amd promotes DX12
It's only natural for pajeets to use amd and windows

From a pure numbers perspective, gaming IS AMD. GCN powers a lot of Apple products (as limited as its gaming scene is), it powers both the Xbone and PS4, and lastly it's in every Radeon card built since 2012.

GCN's presence outstrips Nvidia's by a considerable margin.

/thread

>Thinking that any improvement to DX12 will be made with less than 2 months until release

Uhhhh... yes? It's a rendering subsystem being maintained by a team of industry experts and only needs profiling+optimization, not a full rewrite.

>release game with dx12
>tell people not to use it
Why? Why would you do such a thing? Just so you can slap "DX12 compatible" on it?

That's fucking retarded.

nvidia has a fuckhuge market share for pc gaming and they have better performing gpus than amd and better driver support

So what youre saying is AMD is only good for Windows 10 Pajeets, consoletards, and Applefags

in the software industry you just release whatever you have regardless of the quality

Considering that a lot of "betas" release two WEEKS before the game nowadays, two months is a lot of time to get that work done.

>nvidia has a fuckhuge market share for pc gaming

Not as much as you may think these days and that still pales in comparison to the number of consoles out there. Another year of the same growth and AMD will be looking very, very healthy against Nvidia.

jonpeddie.com/publications/add-in-board-report/

So the largest userbase? Yes.

because no developer has ever released a game and thrown out a performance fix after it's been shipped. 2 months before it's even shipped is the perfect time to do it. Dickhead

So what should i get, a 1060 or a 480? They both already run everything i play at 60fps at 1080p, but in terms of longevity i can't decide between the two

I'm only going with AMD this gen because of FreeSync; Nvidia really fucked themselves over with that one

FreeSync
>buy freesync monitor
>didnt pay out the ass
>everything looks beautiful and silky smooth framerates
>Soon to be supported even on fucking TVs

Gsync
>Only on ugly as fuck Gaymer tier monitors with Ayy Lmao designs (see: Asus ROG monitors and Acer Predator)
>all $1000+ in price due to Nvidia Tax
>Does exactly the same shit no matter how many Nvidia marketing terms that tripfag spams

Worst of all, FreeSync is fucking open source. Nvidia could have picked it up for themselves, but they're thinking "nah, let's just fuck our consumers over and create a mess instead of allowing this new technology to become commonplace and adopted everywhere"

The Gsync shit is a good example of what Nvidia's going to do to the market if they get a monopoly. Hope you shills all have your buttholes prepared.

>6 gigglebytes or 8 gigglebytes

More gigglebytes.

>Worst of all is FreeSync is fucking open source,

The technology behind freesync is years old but to be able to slap the freesync label on a product it still needs to meet certain requirements from AMD (and I doubt the testing is done for free).

Still, there are over 100 freesync screens on the market (101 iirc) and considerably fewer gsync screens. Even more annoyingly, most gsync screens are relatively small and only tend to be 1080p, 1440p and a few of those crazy ultra-wide resolutions, with very few (if any) gsync screens being 4k and large size (as in, 30" upwards).

One of the best review websites around but ok.

So why is RotR's implementation of DX12 broken?

>Open Beta
>Dice/EA game
Gee, what did you expect to happen?

Either is fine, but don't get the 8GB version of the RX 480; it's a meme and you're overpaying for VRAM it can't use properly. It was designed with 4GB for a reason.

It was "broken" recently because in many situations it gave negative scaling gpu wise. What few sites tested though is it gave some fairly heft improvements to cpu usage and due to that minimum frame rates on anything but the latest and greatest cpu.

For alleviating some of the CPU bottleneck it did a good job (as that's what the dev's blog post was about), but too many people were expecting enormous performance gains no matter the hardware configuration, which just didn't happen.

tl;dr game is surprisingly cpu bound (unlike the previous game) and people were doing gpu testing and wondering why nothing changed.

There are 4K 27" GSync screens AFAIK, expensive as all fuck though, as you'd expect.

The situation is still fucked though; I bet adaptive sync would've been a standard like 4k and IPS by now if it weren't for this brand split. AMD is trying to make it more commonplace but has no marketshare, and Nvidia are too Jewish to adopt it.

Thankfully all the consoles use AMD so FreeSync TVs might be a thing. Shit's gonna be great.

Isn't the whole -sync shit a meme anyway? If you're playing at 144fps I can't see how it would matter.

Not all games run at 144fps, user.
Sometimes your GPU can't handle it, sometimes the game itself is locked.

I remember playing DSIII when I got my first FreeSync monitor (which is locked at 60fps); it was smooth like butter.

Word was Intel was going to support FreeSync for their iGPUs, but iirc none of their products actually do. Hell, if Intel does end up supporting FreeSync it's basically game over.

Unless you have a constant 144fps frame drops are going to look ugly, which *sync gets around.

This is true; I was unfortunate enough to have an ATI card when I first emulated PS1 games back in 2008.

But to think they haven't fixed it yet, damn.

...

>which *sync gets around
Not necessarily. Frame time variance is still there and it's the major source of stutter. It gets rid of tearing, and compared to VSync it does make things smoother. If your frame times are fucked up though, they're still going to be fucked up with FreeSync (or GSync).
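To put a number on that: a quick C++ sketch (the frame times below are made-up values, purely for illustration) showing how an average frame rate can look fine while the frame-to-frame variance is what you actually feel as stutter - variable refresh hides the tearing, not the uneven pacing.

#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    // hypothetical delivery times of five consecutive frames, in milliseconds
    std::vector<double> frameTimes = {16.7, 16.7, 33.4, 8.3, 16.7};

    double mean = 0.0;
    for (double t : frameTimes) mean += t;
    mean /= frameTimes.size();

    double variance = 0.0;
    for (double t : frameTimes) variance += (t - mean) * (t - mean);
    variance /= frameTimes.size();

    // prints roughly 18.4 ms mean (~54 fps average) with ~8 ms of deviation:
    // the average looks playable, but the 8 ms swing is the stutter *sync can't remove
    std::printf("mean %.1f ms, std dev %.1f ms\n", mean, std::sqrt(variance));
    return 0;
}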

it's already known

pcgamer.com/total-war-warhammer-dx12-boost-for-amd-still-cant-match-nvidias-dx11-performance/

>GTX 1060 too low to be included on that chart

surely they just didn't test it

Senjou no Valkyria was a really good show too, would recommend.

(Girl in pic is the MC)

If something just works better on AMD cards because they're better at raw compute, it's fine.
If Nvidia arbitrarily gimps AMD with a proprietary API, it's retarded.

>20fps lost in dx12

Should this even be acceptable?

In the console industry it's been like this for decades now and no one seems to complain.