Why haven't you bought a fully compatible DX12 card yet Sup Forums?

what benefits does it provide?

>That 390X killing it.
It's hilarious that this card costs about the same as an RX 480 (maybe a bit cheaper now) and is consistently just under cards almost double its price.

No, it's still at least $60 more, and $100 more compared with the 4GB model.

I did, but Nvidia broke their promise of giving DX12 support to Fermi owners.

>fury x beating the shit out of 1080
hmmmm

I'll probably upgrade from my 780 soon-ish.

I could buy a 1080 today, no issue, but I feel the prices are a bit out of touch.

A 1070 costs the same amount I paid for my 780 back in 2013.

The 1070 is twice the performance of my 780, so there's that, though.

That's in DX12; the 1080 gets 110 fps in DX11.
Nvidia made another DX11 ASIC. It's truly just an overclocked Maxwell stopgap GPU that morons bought.

>980 Ti only doing 62 fps
>970 doing even worse
This can't be right.

Only the Pascal architecture supports feature level DX12_0.

AMD updated their 390X revision to support it, as does Polaris.

Because there's no good videogames that use Dx12 yet.

And now for in-game benchmarks.
computerbase.de/2016-09/deus-ex-mankind-divided-dx12-benchmark/2/

>computerbase.de

>lower framerates in DX12 than DX11
That just proves what an utterly flawed implementation it is. DX12 is meant to reduce API overhead; if you're getting REDUCED performance, then the application programmers are doing something very wrong.
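
For reference, the overhead reduction mostly comes from letting the game record command lists on several threads and submit them all in one call, instead of pushing every draw through a single driver-managed context like DX11 does. A minimal sketch of that pattern, assuming the device, queue, and per-thread command allocators already exist (error handling omitted):

```cpp
// Sketch: DX12 lets the application record command lists on worker
// threads and submit them in one call, instead of funneling all draws
// through a single driver-managed immediate context as in DX11.
// Assumes the device, queue, and one command allocator per thread
// already exist; error handling omitted for brevity.
#include <windows.h>
#include <d3d12.h>
#include <thread>
#include <vector>

void RecordAndSubmit(ID3D12Device* device,
                     ID3D12CommandQueue* queue,
                     std::vector<ID3D12CommandAllocator*>& allocators)
{
    const size_t n = allocators.size();
    std::vector<ID3D12GraphicsCommandList*> lists(n);
    std::vector<std::thread> workers;

    for (size_t i = 0; i < n; ++i) {
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i], nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        workers.emplace_back([&lists, i] {
            // Each thread records its own draws without a driver lock.
            // ... SetPipelineState / DrawInstanced calls go here ...
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // One submission for everything the workers recorded.
    std::vector<ID3D12CommandList*> raw(lists.begin(), lists.end());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```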

Kek, anyone remember all that "DX12 doubles your FPS, incredible high speed" hype?

>nvidia can do no wrong, it's the filthy developers that fuck them up

NVIDIA HAS NO AYYSYNC
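
For what "async" actually means at the API level: D3D12 exposes a separate COMPUTE-type command queue whose work the GPU may overlap with the graphics queue. Whether that overlap actually happens in hardware is the whole GCN-vs-Maxwell argument. A minimal sketch, assuming a device already exists:

```cpp
// Sketch of "async compute" at the API level: a dedicated COMPUTE-type
// queue whose work the GPU may overlap with the graphics queue. Whether
// it actually overlaps (and helps) depends on the hardware scheduler.
#include <windows.h>
#include <d3d12.h>

ID3D12CommandQueue* CreateAsyncComputeQueue(ID3D12Device* device)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type     = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute-only queue
    desc.Priority = D3D12_COMMAND_QUEUE_PRIORITY_NORMAL;
    desc.Flags    = D3D12_COMMAND_QUEUE_FLAG_NONE;

    ID3D12CommandQueue* computeQueue = nullptr;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));
    return computeQueue; // submit compute lists here, graphics elsewhere
}
```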

You're a fucking retard, the chart shows reduced performance on DX12 on AMD too

Why the fuck would you ever play this game on DX12 if you get higher performance on DX11?

Devs have said that Deus Ex's DX12 is broken on high-end cards. Even on AMD, DX11 is faster than DX12.

>synthetic benchmarks

Outside of the built-in benchmark, DX12 is broken in DX:MD.

It shows a 15% increase on the 480 in the Guru3D benchmarks; I don't know what you're talking about.

In Deus Ex? Yeah, feel free to post that graph,
because this one doesn't support your claim.

>Have two 980 Tis
>Deus Ex runs slower with SLI on
>Runs faster on 1 card
>Can run Battlefield 1 on 1 card at 1440p, over 60fps

Fuck whoever ported this shit.

It feels good that you paid double the price of my 390 yet we have the same performance.

I won't even mention anything about poor 970 owners; I'll just leave them to their misery and buyer's remorse.

That's somehow an argument?

My GTX 260 is still fast enough.

I'm fairly happy with my 970. Several bad experiences with dying AMD cards and really unstable performance. Not defending Nvidia's shitty business ethics, or saying the card is fine and it doesn't have an impact, but compared to other issues I've had, it's something I can live with. Upgrading to 4K is out of the question, but that's it.

So, I've never bought an AMD video card. How are they? I'm interested in flagship models, mostly because of FreeSync. Do you need an AMD CPU for it to work properly?

SO DX12 IS EVEN SHITTIER AND RUNS SLOWER THAN DX11 LMAO

...

They're fine.

>Do you need an AMD CPU for it to work properly?
No

I will assume you are serious.
No, you don't need an AMD CPU.
They're fine, all memes aside, and the 4-year-old flagship still kicks ass, unlike the dead 780 Ti.
But the new-gen flagship is not out yet; Vega is October at the earliest.

>Vega is October at the earliest
That's wishful thinking. Vega is 1H 2017, so Q2 2017.

What part of canned benchmark vs. actual gameplay benchmark do you find so hard to grasp?

The benchmark is the Golem City marketplace, 1:1. Of course it performs slightly differently in Prague.

It's the same thing as with The Division, where Nvidia performed worse in cutscenes and AMD worse in gameplay for some weird reason.
That doesn't mean the benchmark is somehow not representative of how the engine works.

The game doesn't even actually support DX12 yet. What's the fucking point? By the time they release it, everyone will be done with the game.

>all these shitty ass charts

You can only trust the end user, and the end user, myself, with a 6700K and a 1080 G1 GAMING, gets these results at 2560x1440, Ultra settings, no MSAA, in DX12.

tl;dr: DX12 is worthless in DXMD and the game still runs like utter shit, just like it always did. Refund if you can.

Yeah, I'm serious. I need opinions from current owners to decide.
Do you use FreeSync? Any downsides?

>Deus Ex
Why does such an ugly game run so poorly?

So why does DXMD run so poorly despite looking like shit?
Homefront might be a crap game, but at least it doesn't run and look like trash.

I switched to 144Hz; the input lag is just that good, and there's no obvious tearing even without V-sync.

When I used FreeSync it was fine, no tearing or lag (it's just that 144Hz with less than 3ms input lag is much better at any fps).

I have a Sapphire 390 Nitro though, so not a flagship.

Just uninstalled it; it ran the same 45-50fps at 1440p, just like MD.

>that 970

JUST

>but at least it doesn't run and look like trash.

>that tree

So there's no point in FreeSync/G-Sync if you can reach 144fps in a game?

Are you running on max settings?
If so, that'd explain it.

Because most of my backlog consists of pre-2014 DX10/11 titles. By the time I'm finished, I reckon I can get a GTX 1450/1350 for under $180 and play Mankind Divided at 60-70fps.

god this is hard to watch as someone who just got a GTX 1080 for their 1440p monitor

Meanwhile, DOOM runs at a locked, silky-smooth 96fps with every in-game setting maxed.

Well DOOM is optimized by people who actually know what they're doing.

DOOM is capped at 200 FPS. And you can actually hit the cap on decent hardware

Not to mention, it uses the Vulkan renderer, which brings performance improvements to both AMD and Nvidia cards on all supported OSes, not just Windows 10 like DX12.
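
The portability point is visible in the API itself: the same Vulkan bootstrap code runs on Windows 7/8/10 and Linux, whereas D3D12 device creation only exists on Windows 10. A minimal sketch (the app name is a placeholder, error handling trimmed):

```cpp
// Sketch: creating a Vulkan instance is identical code on every OS the
// loader supports, which is the cross-platform point being made here.
#include <vulkan/vulkan.h>

VkInstance CreateInstance()
{
    VkApplicationInfo app = {};
    app.sType            = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.pApplicationName = "vk-demo"; // hypothetical app name
    app.apiVersion       = VK_API_VERSION_1_0;

    VkInstanceCreateInfo info = {};
    info.sType            = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    info.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    vkCreateInstance(&info, nullptr, &instance);
    return instance;
}
```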

Yeah, DX12 as a whole is generally pretty shit; the performance boosts in most games are either negligible or non-existent.

Well duh, but you can't; there's no card that can do a STABLE 4K 60fps in every game.
The 1080 can't brute-force that; give it a couple more gens.
Point is, if you have a 144Hz monitor, sync tech loses its point; it's useful in the 30-70Hz range.
Above 100Hz you can only see tearing if the fps drops are huge, like losing 30-50 fps and jumping back up in the same moment.

I thought Homefront was a console game?

I think Vulkan will get used more and more; there's a fairly big consortium on board, with people like Valve and Nintendo, and Android supports it now.

Yeah, the game is just too empty; even Far Cry 3 had more content in it.

DX12 runs like shit on Nvidia; this is not surprising.

Meanwhile, AMD gets a solid 60 FPS.

Because I'm a NEET with no money.

Huh, I remember this screenshot from yesterday.
Isn't it your friend's rig? Get your story straight.

>Isn't it your friend's rig? Get your story straight

No, I'm "playing" the game through my friend's STEAM LIBRARY; the rig is mine.

With DX12, the devs have to do what the NVIDIA and AMD driver teams do with DX11. Devs have to write architecture- and SKU-specific optimizations in their engine, or the performance will be shit on those products which don't have those optimizations.

This suits AMD well because their driver team is shit but their architecture is the same as in the consoles, so the engines likely have GCN-specific optimizations already; AMD is saving money when devs use DX12. For NVIDIA this doesn't work as well, because devs are not likely to write a whole new rendering path for NVIDIA GPUs and will only leverage AMD-specific DX12 features. Using DX12 also makes it impossible for NVIDIA's driver team to fix inefficient game code by swapping it out with their own in the drivers.
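
One way engines end up doing that in practice is branching on the adapter's PCI vendor ID and picking a tuned path per architecture. A hedged sketch: the vendor IDs are the real PCI IDs, but the RenderPath names are hypothetical placeholders.

```cpp
// Sketch of vendor-specific tuning under DX12: the engine, not the
// driver, has to pick the fast path. Vendor IDs are real PCI IDs;
// the RenderPath enum is a hypothetical placeholder.
#include <windows.h>
#include <dxgi1_4.h>

enum class RenderPath { GenericDX12, TunedForGCN, TunedForPascal };

RenderPath PickRenderPath(IDXGIAdapter1* adapter)
{
    DXGI_ADAPTER_DESC1 desc = {};
    adapter->GetDesc1(&desc);

    switch (desc.VendorId) {
        case 0x1002: return RenderPath::TunedForGCN;    // AMD
        case 0x10DE: return RenderPath::TunedForPascal; // NVIDIA
        default:     return RenderPath::GenericDX12;    // Intel etc.
    }
}
```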

>the rig

sorry

>DX12 runs like shit on Nvidia, this is not surprising.

DX12 requires the dev to reimplement the low-level rendering code that was handled by the graphics driver in earlier versions. This is code you can't reuse between AMD and Nvidia. If you fail to do so, you get even lower performance than with DX11. In DXMD's case the devs have basically admitted they haven't done the required work for Nvidia cards yet (see their known-issues section) and are working on it.
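
A small example of the kind of bookkeeping that moved from the driver to the game: explicit resource state transitions, which the DX11 driver used to track automatically. A minimal sketch:

```cpp
// In DX11 the driver tracked resource states for you; in DX12 the
// engine must issue explicit transition barriers, and getting them
// wrong (or overly conservative) per architecture is exactly the
// kind of work that can make one vendor's cards run slower.
#include <windows.h>
#include <d3d12.h>

void TransitionToShaderRead(ID3D12GraphicsCommandList* cmdList,
                            ID3D12Resource* texture)
{
    D3D12_RESOURCE_BARRIER barrier = {};
    barrier.Type  = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
    barrier.Flags = D3D12_RESOURCE_BARRIER_FLAG_NONE;
    barrier.Transition.pResource   = texture;
    barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
    barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
    barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;

    cmdList->ResourceBarrier(1, &barrier);
}
```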

Nice. Then I'll skip this generation and wait for 144Hz 4K displays and video cards that can drive them.

Mfw I bought a 390X over a year ago.

I'm not going to use Win 10.

Guru3D was only using the built-in benchmark at "High" settings (more like "Medium", since "High" is two notches down from "Ultra").

Non-retarded sites like Computerbase.de got completely different results with "Ultra" settings and actual gameplay instead of the synthetic benchmark.

*also on PC

Nah, the game just flat out runs like shit on every brand.

Deus Ex runs like shit on DX12.

Is it any good, or is it another codlike?

It's mediocre.

Guess what, OP: I bought an R9 390 a year ago and upgraded to a GTX 1070, a DX11 card. You're probably furious.

If I just want 1080p 60fps for all games, which card should I buy? I was thinking GTX 1070, but that shit's $399 and people say it has coil whine or whatever.

Strictly 1080p, you can easily manage with a GTX 1060.

Apparently the 1060 can barely do 60fps in Witcher 3, though. I'm hoping the 1060 can last me a few years at least. The 1070 will definitely give me 60fps for at least 2 years, I think.

Meh, just replay Far Cry 4; it has more content in it.
It's free for this weekend, try it; I stopped at the 3-hour mark.

The GTX 1070 drops below 60fps on Ultra in GTA V, unless you go without AA.

I can do 60fps in Witcher 3 at 1440p and a 1060 can't?

>barely do 60fps

Not true at all. Keep in mind that this is absolutely maxed out without HairWorks, so if you drop shadows to low (zero visual impact) and turn off the useless meme post-processing effects, you will gain around 15 fps.

youtube.com/watch?v=CAOZlEZ5WHg

No GPU on Earth can handle their shitty Ultra quality grass

NVIDIOTS BTFO

>drop shadows to low (zero visual impact)

>(zero visual impact)

>drop shadows to low (zero visual impact)

>380X stronger than a 970
What in the fucking hell

Tell me which of these pictures were taken on Ultra shadows and which were taken on Low. Go on, I'll wait.

Did you fall for the meme card?

Because Nvidia will only implement full DX12 support by 2025.

Probably the bottom one is the highest, since it seems a bit sharper.

no

Actually, top left is Ultra, top right is Low, bottom left is Low, and bottom right is Ultra. I think my point stands; you might as well drop the shadows to low or medium, since the visual impact is so minimal.

I've played around with the settings enough to know that shadows on anything but Ultra do not cast branch shadows from trees in that much detail.

So do you think the 1060 can last me a few years, and the 1070 isn't necessary? (For just 1080p.)

Fuck off, retard. Nvidia cards support up to 12_1. It's AMD cards which only support up to 12_0.
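
The feature-level question is directly checkable: D3D12CreateDevice either succeeds or fails for a requested minimum level. A minimal probe, assuming the default adapter and skipping most error handling:

```cpp
// Quick probe for the feature-level argument: try to create a device
// at 12_1, fall back to 12_0. Passing nullptr as the adapter uses the
// default one.
#include <windows.h>
#include <d3d12.h>

D3D_FEATURE_LEVEL ProbeFeatureLevel()
{
    ID3D12Device* device = nullptr;
    if (SUCCEEDED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_1,
                                    IID_PPV_ARGS(&device)))) {
        device->Release();
        return D3D_FEATURE_LEVEL_12_1;
    }
    if (SUCCEEDED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                    IID_PPV_ARGS(&device)))) {
        device->Release();
        return D3D_FEATURE_LEVEL_12_0;
    }
    return D3D_FEATURE_LEVEL_11_0; // DX12-capable, but only at 11_x
}
```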

I went from a 780 to a 1070. Wish I'd waited to get a 1080 Ti, though.

For just 1080p? Yes, I don't see any reason why it wouldn't last you a couple of years. And when the point comes where you can't simply put everything on Ultra and call it a day, you have to get smart and know which settings have a massive FPS impact with relatively low visual gain, just like the example I gave with Witcher 3's shadow setting.

>want to upgrade from 970
>won't buy nvidia after DisplayPort Maxwell series disaster
>AMD won't have anything better than the Fury X until next year
This sucks.

Intel iGPUs currently have the biggest set of supported DX12 features

>w10 required
Not interested