Is AMD driver overhead just a meme?

What do you think would happen when you compare a fast processor to a slow one?
I don't know, maybe Nvidia has a driver overhead problem as well.

No. It's an actual issue.

>hairworks.
Useless shit that would bog a 5960x down
Thanks, Nvidia.

It's not a meme.

Look at one of the Digital Foundry videos where they tested Just Cause 3 on like a 390.

It would keep up with a 970 until something exploded. The 970 would keep chugging along, but the 390 dropped a lot of frames and the frame-times spiked.

Although, from the tests so far, the new RX 480 is improved.

Not just a meme, a Nvidia shilling meme

AMD's OpenCL implementation performs better than Nvidia's. And no one plays games or uses proprietary software.

Here it is, watch at 1:50

youtube.com/watch?v=4JHuyLoYU4c

it's also been documented in a lot of other games.

Are you legit stupid or trolling? It doesn't use hairworks. In actual fact it uses AMD PureHair.

yeah it's legit. most reviews don't show this though because they use shit like overclocked 5960x CPUs and similar setups that you won't ever see in real world usage unless you're rich and like to waste money

forums.guru3d.com/showthread.php?t=398858

I don't know why this game is popular, but whatever.

Here it would be like on my system. Steady 60fps. Not exactly a rich setup.

youtube.com/watch?v=X_zASxtpRn4

Enjoy your 3.5.

GTA 5 on AMD seems stuttery at times.

yes

I never understood why they did that. I want realistic gpu performance, not ideal gpu performance

What card?

pure hair hasn't been in amd's hands for a long time; since the first rotr the devs have said they "created" their own version of it to ship as third-party tech

It removes the cpu from the equation. Do you really expect all your favorite titles to be tested with 5-year-old cpus, or bargain-bin AMD processors? You faggots hate upgrading, but then hate your reduced performance and blame it on others.

Hairworks is shit, even on Nvidia products.

This is why AMD is pushing DX12 so hard: they have given up on DX11. When you use a low-level API you give up a lot of the support and debugging tools that are inherent to a layered system like DX11. Not to mention that going down the low-level path you risk breaking compatibility even within the same API once overhauls are done to the hardware, meaning that when new hardware is released the low-level APIs AND the games will need to be updated or compatibility breaks. What are you going to do once DX12/Vulkan becomes popular and hardware changes? Old games will become unplayable unless the developer still supports them; Nvidia/AMD won't be able to change the game's source code, and the amount of work they can do on the driver side is very limited compared to DX11. There are huge implications and sacrifices in moving to low-level APIs, especially with so many hardware revisions.

>It removes the cpu from the equation.

All it does is artificially boost AMD's performance in benchmarks. Nobody is pairing a 380 or 390 with a 5960x @ 4.5GHz and a $400 custom loop.
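
If it helps, here's a rough toy model of why the CPU choice hides or exposes driver overhead (my own sketch, all numbers made up for illustration): a frame can't be presented faster than the slower of the CPU side (game logic + driver) and the GPU side, so extra driver cost only shows up once the CPU is the bottleneck.

```python
# Toy model of frame delivery: numbers are illustrative, not measured.
def fps(cpu_game_ms, driver_ms, gpu_ms):
    """Frame rate when CPU work (game + driver) and GPU work run in parallel.

    The frame can't be presented faster than the slower side.
    """
    frame_ms = max(cpu_game_ms + driver_ms, gpu_ms)
    return 1000.0 / frame_ms

# Same GPU cost per frame, two hypothetical per-frame driver costs:
for cpu_ms, label in [(6.0, "fast cpu"), (12.0, "slow cpu")]:
    light = fps(cpu_ms, driver_ms=2.0, gpu_ms=14.0)
    heavy = fps(cpu_ms, driver_ms=6.0, gpu_ms=14.0)
    print(f"{label}: {light:.0f} fps (light driver) vs {heavy:.0f} fps (heavy driver)")

# fast cpu: 71 fps vs 71 fps  -> GPU-bound, the overhead is invisible on a 5960x
# slow cpu: 71 fps vs 56 fps  -> CPU-bound, the exact same overhead costs 15 fps
```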

Can u explain frame-time to me, I don't get it

Hd 5450

I'm still just waiting on whatever point you are trying to make. Turn hairworks on, shit performance, on all cards, with all hardware.
Thanks Nvidia.

No shit, retard. Those are meant to be used by the top 1% of hardware; once the technology matures, combined with improved hardware, it becomes available to the mid-range user. Do you even know how technology advances? You need to invest first to reap the benefits years later.


Someone has to do it even if you don't like it, and AMD keeps playing the emo card about how unfair it is. If AMD thinks it's unfair, they should invest the money, develop the technology, push it to the public, and try to move things forward instead of complaining and whining.
Open source or closed source doesn't matter; what matters is that the investment is being done by someone, either Nvidia/AMD or game developers. In the end we all win.

This is correct. My friend has a moderately cheaper build with a 390 and an fx 8350, and when we play games like dayz, or even less cpu-intensive games like dying light, he always runs into issues due to that overhead. Like when he would come out of a cutscene in dying light, his fps would tank like shit while it re-rendered the world.

Are you homeless? Buy a modern card you poorfag NEET.

this has nothing to do with the drivers...
using gameworks in a game always results in lower framerates on amd simply because, until the 480, amd didn't have any way to cull sub-pixel triangles
but now they have a primitive discard accelerator that is completely configurable
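
For anyone wondering what "culling sub-pixel triangles" actually means, here's a toy sketch (mine, not AMD's actual hardware logic): if a tessellated triangle's screen-space bounding box contains no pixel sample point, it can never produce a fragment, so it can be thrown away before rasterization for free.

```python
import math

def covers_any_pixel_center(v0, v1, v2):
    """Conservative discard test: does the triangle's screen-space bounding
    box contain any pixel center (k + 0.5)?  If not, the triangle can never
    produce a fragment, so discarding it is always safe."""
    xs = (v0[0], v1[0], v2[0])
    ys = (v0[1], v1[1], v2[1])
    # Smallest pixel-center coordinate at or after the box's min edge:
    first_cx = math.ceil(min(xs) - 0.5) + 0.5
    first_cy = math.ceil(min(ys) - 0.5) + 0.5
    return first_cx <= max(xs) and first_cy <= max(ys)

# A heavily tessellated patch spits out triangles like this one, wedged
# entirely between the pixel centers at (10.5, 20.5) and (11.5, 21.5):
tiny = ((10.6, 20.6), (10.9, 20.7), (10.7, 20.9))
print(covers_any_pixel_center(*tiny))  # False -> discard, zero raster cost
```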

280x

Quit comparing benchmarks with useless visual fluff that does nothing to improve gameplay. Your friend is a stupid fuck for buying an 8350. You have no reason to complain.

It's not about gameplay, it's all about image quality ;)

The envelope must be pushed by someone.

Swapping to DX10 or 10.1 will fix the issue. The API handles it differently in that game.

People with 970 and fx 8350 don't have that problem. I've checked many youtube videos.

Let's stop the denial already. Amd has higher driver overhead compared to nvidia and that can't be denied.

Guys, if I start a review site using weaker shit like i5 2400 or i5 750 would you guys take it seriously?

I have some old hardware laying around I'm tempted to benchmark out of boredom, and if it's successful I might add new shit like rx470 to it.

Haha holy fuck. Your image quality (that you will only notice in static frame shots) is compromising your gameplay by crippling your frame-rate, and this is somehow progress?
Nvidiots, everyone

Every now and then these gpu threads actually have an interesting post where i learn something and this time it was yours, thanks

Use youtube, there are a lot of users that are always looking for actual gameplay videos on low end hardware.

Sacrifices are needed; if you aren't willing to make them then move aside and let the big guys work.

60 fps means a frame needs to take at most 16.7 ms to render.

EACH FRAME.

But let's say you have an unusual load in your game. One frame renders in 6.7 ms, but the next frame renders in 26.7 ms.

Averaged out, that's still 60 fps: (6.7 + 26.7)/2 = 16.7 ms, aka "60 fps".

But since the last frame took more than 16.7ms to render, it would be presented late. The first frame would be presented twice while the longer frame is still rendering.

That's basically what frame-pacing / frame-time is.

So you want a nice and smooth frame-time graph of 16.7 ms or lower.
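
Quick sketch to make the averaging trap concrete (made-up numbers): the same "60 fps" average can come from perfectly paced frames or from alternating fast/slow frames that stutter.

```python
# Two runs with the SAME 16.7 ms average, very different to actually play.
smooth = [16.7] * 6
stutter = [6.7, 26.7, 6.7, 26.7, 6.7, 26.7]

for name, times in (("smooth", smooth), ("stutter", stutter)):
    avg = sum(times) / len(times)
    late = sum(1 for t in times if t > 16.7)  # frames missing a 60 Hz deadline
    print(f"{name}: avg {avg:.1f} ms (~{1000 / avg:.0f} fps), "
          f"{late}/{len(times)} frames late")

# smooth:  avg 16.7 ms (~60 fps), 0/6 frames late
# stutter: avg 16.7 ms (~60 fps), 3/6 frames late
```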

crysis did push the envelope because they knew it had to be done
gameworks just fucks with amd cards and nothing more

people already compared hbao+ with vxao and literally found nothing better about vxao, a filter that's almost 8 years newer...

Thanks, will give that a try

Oh yes, you are an expert.
They are equivalent cards, is this hard for you to accept?
youtube.com/watch?v=vSDQzlKDYq4

Your driver overhead is a meme.

It'll also be cheaper since I won't have to rent a site.

Thank u. For the viewer, is this what stuttering is?

Your entire argument betrays you. AMD spent 5 years downplaying tessellation, and now that they've released an architecture overhaul with improved tessellation performance they're even advertising it as a feature.

As much as I prefer AMD cards, this is why people buy NVIDIA.

They nail the little things even if their card is a bit slower.

Oh my god you are a pajeet, your english betrays you.
How much do you get paid for this? Does it upset you knowing you are not influencing anyone?
Tessellation is a meme, especially at x64 levels. Totally unnecessary.

>i7 4790k

>he thinks goyim will be matching their budget GPU with a $400 i7

Does the rx480 suffer from it too?

Dude, what?
Tessellation is progress.

>I7 4790k

Welp there goes your argument out the window.

yes, but what would you have done when the competitor just uses tessellation as a weapon?
it's like when they say "oh, we are dx12.1 cards", like people are so stupid that they don't know those cards can barely do dx12, let alone be full dx12 cards..

>is this what stuttering is?

Yes.

Average frame-rate, aka "fps", is just that: an average, and it will hide (not quantify) stuttering.

Think of frame-time as the instantaneous frame-rate.

ALL stuttering will show up on a frame-time graph.
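
And if you want a single number instead of eyeballing a graph, one common trick (my sketch, not any specific review site's exact method) is to report the worst percentiles of the frame-time distribution, e.g. the 99th percentile, instead of the average:

```python
def percentile(frame_times_ms, p):
    """p-th percentile of frame times (nearest-rank, no interpolation)."""
    s = sorted(frame_times_ms)
    rank = max(1, min(len(s), round(p / 100 * len(s))))
    return s[rank - 1]

# 95 smooth frames plus 5 big spikes (hypothetical capture):
times = [16.7] * 95 + [40.0] * 5

avg = sum(times) / len(times)
p99 = percentile(times, 99)
print(f"average:  {avg:.1f} ms (~{1000 / avg:.0f} fps)")  # hides the spikes
print(f"99th pct: {p99:.1f} ms (~{1000 / p99:.0f} fps)")  # exposes them

# average:  17.9 ms (~56 fps)
# 99th pct: 40.0 ms (~25 fps)
```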

>Tessellation is a meme, especially at x64 levels. Totally unnecessary.

'64x' tessellation is just 16x with 4x MSAA. Not unnecessary at all if you care about your game not being a jaggy mess, though perhaps AMD cucks prefer a degraded experience.

N/v/idiot shills belong back on Sup Forums

x64 on nvidia cards doesn't mean true x64... remember, nvidia cards to this day can cull those triangles in a way that makes x64 effectively a true x32 for them, but on amd an x64 was a real x64 because they didn't have any tech to cull the triangles (a stupid decision from the former ceo)

Thank you

The 970 and 390 were midrange cards; anyone who bought one could afford a 4790k.
But here is an i5 for you poorfags, still keeping up perfectly:
youtube.com/watch?v=04fRG5UixFM
It is literally useless at the x64 levels Nvidia is pushing

AMD Firepro Q9100 32GB is best graphics card.

No doubt.

see

fuck, typo.
W9100

I can't tell if you're actually thick or just pretending. Nvidia cards support dx12_1, and if you were educated enough you'd realise that dx12 is built on different tiers. Dx11_3 and dx12_0 require async support, which is why the amd 300 series is classed as dx12_0. The maxwell cards, which support dx12_1, support async (to a shittier extent, but the support is still there somewhat) and also conservative rasterization (which the amd 300 series doesn't support). So maxwell does in actual fact fully support up to dx12_1.

Lots of people are still using i5 3470 and earlier.

You don't upgrade to the latest cpu every year...

>thinks everyone is using i5

Check the steam hardware survey. Everyone uses 970 with shitty cpus most of the time.

Quit bitching about driver performance when your cpu is 20% behind anything on the market.

Here is with an i5, it still stands

So people that buy shitty cpu's buy shitty 970's? Got it.

Writing a driver to accept Async commands but ignore them isn't quite "support."

At the hardware level, or just a software implementation, or hybrid?

Why? I won't have this problem with nvidia, why should AMD be allowed to do this?

I'm actually on an i7 870 which is plenty for me, as long as I don't use AMD.

You would have more of a problem with the horrible overhead Hairworks puts on your shitty cpu. Complete denial over the true reason for your poor performance: an old-ass cpu.

let's be clear here:
dx11.3 was the last version of dx11
dx12 is literally dx11.3 with Mantle
dx12.1 NEEDS full compliance with dx12 because it doesn't ADD anything new, it just expands the dx12 envelope to full heaps

the maxwell cards never supported async; the cuda cores do, but if nvidia activated that they would have a hard time explaining why their power usage skyrockets from using the hardware scheduler they have for the cuda cores..
yes, in fact, but dx12 isn't only rasterizers, is it (which they do have, at tier 2 only), and since they don't have any pixel shading even on pascal, nor any dynamic core to use resource heaps, I'd say they are far from being called even a full dx12 card.. let alone dx12.1....
not to mention that the whole dx12.1 thing started right after august 2015 when people shit on them because of the async fiasco

Then disable hairworks?
It's so simple.

youtu.be/XoKu0_2ozAc?t=18

Why don't you fuck off with your old-ass 770 slav videos. I thought this was a 390 vs 970 showcase?

bu but hairworks is da future
it gives outstanding hair!
ITS THE FUTURE
literally 3dfx had something called boobjobs back then and they didn't even care to have it on their boxes as a feature..
and now we get pubic hairs

Why does it kill performance on Nvidia cards, also? Driver overhead?

Are you fucking retarded or what.
Even if you ignore the 770 you still have Mantle vs DX11 on the AMD side.

You can see the 390 on DX11 getting an average of 60% GPU usage even at low settings.

I fail to see how this is a problem when the 390 is clearly better even at its 60% vs all the other competition.

Holy shit what?

muh driver overhead

It's gone from

>driver overhead doesn't exist
To
>driver overhead doesn't matter

Lmao these fucking retards

youtube.com/watch?v=S546TL2LWNY&feature=youtu.be

>I7 4790k

Another opinion thrown out the window.

Idgi

failure to detect sarcasm is a common characteristic of autism.

I'm unsure of the point you are trying to make. Is it:

Don't buy a slow cpu

or

Don't use an old cpu

and expect to have optimal performance?

>gtx 970 average is below its minimum
wewlad

or don't buy AMD and you won't have to upgrade your CPU.

because gameworks affects nvidia less than it affects amd
look at project cars: the dev knew this was going to happen and yet he used a version of physx that
1) you couldn't disable
2) was an older one that ran on only the first thread of the cpu
3) offloaded everything concerning the gameworks filters onto the cpu when you had an amd card
he literally made it unplayable, a 950 was beating a 390, and he thought it was fine, and when we called him out he started banning our accounts and giving us refunds
i don't want a future where we choose a game depending on our gpu... this would be VERY BAD for everyone..

No one that has a fairly recent cpu will get horrible performance out of any gpu they buy today. This is really a non-issue, unless you want to run at ultra with all the eye candy. Quit the dramatics.

Too bad, this is the path AMD is taking with DX12. They are trying to push their hardware on everyone, and from now on you're going to get games that run fine on AMD but worse on Nvidia, or fine on Nvidia but worse on AMD.

Welcome to AMD's vision of PC gaming.

Don't buy amd at all is the solution. I can run an nvidia card on a 5-year-old cpu just fine without getting bottlenecked in cpu-intensive scenes by amd's driver overhead. If I had a 2500k right now and a 390, I would be getting fluctuating frame times in cpu-intensive games compared to a 970. Simple as that.

>i7 4770k
>Not recent


Simply impossible to have a discussion with you, you lack basic logic.

You don't blame skeevy game developers, you blame the hardware manufacturer that had nothing to do with it?

I don't care because I have a i7 3770k. AMD could be better but they're not as terrible as people make them out to be. Nvidia is jewing everyone hard and they need to be stopped.

a feature that preemptively discards useless tessellation isn't exactly flattering to the awesomeness of tessellation.

Impossible to have a discussion with you when you only:

>i7 4770k

and expect to make any sense at all. Do you want to show me something to make your point?

It would be perfectly playable for having a 6 year old cpu in it. More dramatics.

Didn't AMD introduce dedicated tessellators in the first place?

*muffled Linkin Park plays in the background*

>i-it would b-be perfectly playable
>literally ignoring the solid evidence showing it's not perfectly playable

Neck yourself

I don't listen to Linkin Park. Or any kind of edgy music for that matter. Nvidia is charging too much money and is involved in too many scandalous business practices, such as forcing heavy tessellation in games to lower the performance of their previous-gen cards, or straight up LYING about the specs of their GPUs. I'm not saying AMD is some benevolent corporation; they just got caught jewing out their reference cards with 6-pins instead of 8-pins, either to save money or to try to wave that "muh power consumption" ePeen, but they are nowhere near as bad as Nvidia, and their cards are fine. There needs to be a healthier balance of market and mind share between the two.

AMD isn't an option until driver overhead is resolved.

I get okay performance with hairworks fully enabled in TW3 on an i5 750 and a GTX 980 with all settings maxed. Over 30 fps!

I am glad you are so poor that you have to worry about one frame of one game with one card and one cpu, and have this much anxiety over other people's graphics card choices. The 390 is really the better all-around performer compared to the 970. I'm sorry you can't enjoy things the way they are meant to be played.