>AMD cards aged better than nvidia
No, CPUs just got stronger, helping alleviate the horrible driver overhead on AMD's side.
There's a reason why the HD 7970 performs so much better on an i7-6700K than it did on an i7-2600K.
>amdcucks on suicide watch
All this tells me is that AMD needs to steal Nvidia's driver team, since their GPUs are just straight up better than Nvidia's.
Why do you think GCN is seeing such a performance gain on Vulkan and DX12?
somehow they're still behind nvidia on weaker hardware
You have 10 seconds to delete this.
I love this line of reasoning because it doesn't actually change the fact that AMD performance improves over time.
It doesn't matter when it's worse than old gpus anyway.
Yeah because everyone with an AMD card is using the latest i7.
When you buy AMD, you won't even get close to the performance you see in benchmarks; even a Haswell i5 is a bottleneck.
>buy new cpu instead of gpu to bypass amd incompetence
amdrones, everyone
>Be AMD fag
>buy the latest $300 i7 every 2 years instead of a $300 GPU every 4 years
No wonder they're so poor
I got a 4770k in 2013 that I won't need to upgrade. Where was it shown that a 2600k is worse than a 6700k like in OP's pic? It isn't in the link.
>he thinks an i7 2600k is just as good as an i7 6700k
When it comes to games, yeah pretty much. 6700k is better but you can close the gap quite significantly with OC.
>showing an nvidia only benchmark
I guarantee the result will be very different with an AMD GPU.
Well I was kinda hoping someone would actually post something similar with an AMD GPU, which is what I was asking for. I think Digital Foundry also did something similar with a 290X, but I can't find it. I don't remember the results being that far off.
Yeah just had a look, they're all either 970, 980 ti, or Titan x ffs.
This game is very GPU-intensive; that Titan X is already at max GPU utilization. Not even an overclocked GTX 1070 can maintain a consistent 60 FPS at 1920x1080 max settings in this game.
This is a poor example and even Digital Foundry mentions it.
You're free to look at the other results. I just had the pic on hand from another thread. The gap is small except for a few outliers.
There are other results too, with the Titan X showing much better numbers.
That being said, it should be common knowledge that GPU-bound games, like BF4 for example, will be affected less by the CPU than CPU-bound titles like GTA V or Fallout 4.
I know this is slightly off topic, but what kind of performance can we expect from Vega? AMD released the 480 before the 1060, which gave Nvidia some time to refine their card to beat the 480. Now that the 1070 and 1080 are already out, should we be expecting faster performance, given AMD's ability to change up clock speeds etc. to beat the higher-end Nvidia cards?
This is just my opinion, but I really can't see AMD suddenly pulling out a couple of cards faster than the 1070/1080 the way Nvidia did with the 1060, not unless it's 3xx-series levels of heat and power consumption.
I think OP is just pulling shit out of his ass to be honest family
computerbase.de
GTX 1080 at best.
so if i have a 4770k, i'm better off with nvidia?
>No, CPUs just got stronger, helping alleviate the horrible driver overhead on AMDs side.
In other words they're better now than they were. As in, they're aging better. Deal with it Nvidiot.
>dumb amdrone
Nicely done.
Holy shit you're reaching a new level of autism.
I just wasted 5 seconds of my life finding out just how CPU-agnostic this game is; hilariously, a GTX 960 is not only ahead of the 380X but only 4 frames behind a 390.
The only one pulling shit out of his ass is you, and you're reaching in deep because you're running out of shit to post.
guru3d.com
>implying AMDumbs are rich enough to keep updating to the latest i7
So provide some proof of your theory. Post a comparison between the 7970 running on a 2600K and a 6700K at the same clock speed. You've been spamming this same handful of images here for literally weeks and have added precisely nothing to your shitposting in that time.
I can't tell if you're legit retarded or just trolling.
The point being made here is that whenever older AMD GPUs are tested to see potential performance gains, it's always with the latest i7 the reviewers have, so there's little chance of CPU bottlenecking. In reality it's not "aging better"; it's CPUs becoming stronger and stronger, which helps out with AMD's major driver overhead. If you want to see this "aging better", you're gonna have to buy the latest and greatest i7 every year.
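To put that concretely, here's a toy model of the argument (all numbers are made up for illustration, nothing here is measured): each frame is limited by whichever side finishes later, so a faster CPU can completely hide a heavier per-frame driver cost.

```python
# Toy model: frame_time = max(CPU work + driver overhead, GPU work).
# Illustrative numbers only; "low/high overhead" drivers are hypothetical.
def fps(cpu_ms: float, driver_ms: float, gpu_ms: float) -> float:
    """Frames per second when the slower of CPU-side and GPU-side work limits the frame."""
    return 1000.0 / max(cpu_ms + driver_ms, gpu_ms)

GPU_MS = 12.0          # GPU needs 12 ms per frame (~83 fps ceiling)
LOW_OVERHEAD = 2.0     # hypothetical "thin" driver, ms of CPU work per frame
HIGH_OVERHEAD = 6.0    # hypothetical "thick" driver

for cpu_label, cpu_ms in [("old CPU", 10.0), ("new CPU", 6.0)]:
    low = fps(cpu_ms, LOW_OVERHEAD, GPU_MS)
    high = fps(cpu_ms, HIGH_OVERHEAD, GPU_MS)
    print(f"{cpu_label}: thin driver {low:5.1f} fps, thick driver {high:5.1f} fps")

# old CPU: thin 83.3 fps vs thick 62.5 fps -> big gap, looks like "bad aging"
# new CPU: thin 83.3 fps vs thick 83.3 fps -> both GPU-bound, the gap vanishes
```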
lmao, that was just the first graph that came up. There are others there, did you even look?
I'm honestly startled that you've managed to give yourself an aneurysm over something I just came across and decided to post. I honestly don't care that much.
>I guarantee the result will be very different with an AMD GPU.
spoiler: no significant difference
anandtech.com
anandtech.com
OP BTFO
You shouldn't be trusting computerbase anyway. They're unreliable.
So wait a fucking minute, an i7 really is best for gayming?
/g/ is so full of shit
...
>literally posting every cpu agnostic title he can find
PretendingToBeRetarded.png
Only if you're running an ayymdpoor gpu.
Oh fuck off, I can go and grab a screenshot that shows +25fps on that same video.
Why do you have to resort to lies and half-truths?
This is an interesting read if you're interested.
There's only 4 to choose from, bro. This is the one with the widest gap, if it makes you happy.
Giving me a giggle how invested you are in this theory of yours.
Resorting to insults is pathetic
>amdrones must get either amdpoor cpu+nvidia gpu or intel cpu+amdpoor gpu if they want playable fps
>retards running ayymdpoor cpu+gpu are literal cucks
That isn't interesting at all. Correlation does not imply causation. I asked you to provide evidence of a test carried out on two different CPUs back to back or side by side, not one set of results from 2012 and one from 2016, with you reading into them whatever suits your argument. Not an article which has nothing to do with the subject at hand, other than you putting your own slant on its results to suit your argument.
You do realise your image is showing a bigger performance loss on the AMD cards than the nvidia cards, thereby proving op right, don't you?
In general no, but it works wonders on CPU-hogging games (Fallout 4 comes to mind).
Pretty much, especially for cpu intensive games, or if you're using an AMD card.
They are still very playable, and you won't even notice for crisakes.
Is it really surprising an i7 is a stronger cpu to you?
Do you expect amdfags to be able to read graphs? The same ones who post that Tom's slide showing the 960 Strix with a 50W average draw and the 480 with 80W.
So would an i5-4690 bottleneck an AMD card?
I have a GTX 770 and I've been thinking of snagging an RX 480. I've never really thought about CPUs before. My 4690 doesn't seem to run into problems.
when you go on the offensive, you show your hand
you do not feel secure with the choices you've made
Oh sure, but there are other factors like 4 threads vs 8 threads. Maybe it's not that CPUs are "better" but that AMD uses more cores. Like I said, I just came across it. I really have no beef, because I buy parts I need for work, and that will almost always land me on an Intel CPU and an AMD GPU for other factors like rendering.
>muh 5 fps
You wouldn't even notice in real life
These graphs are deceiving. They make you think there's no difference between a 5-year-old i5 and a recent i7, when there's a huge gap of up to 30 fps, especially in CPU-intensive scenes.
This video shows it. They're almost the same when the CPU isn't being stressed, but when they go to the road with all the CPU-heavy effects, the i5 dips below 60 fps whereas the i7 is at like 90 fps. Don't forget this game is very well optimized for all types of CPU too.
The people at computerbase are probably too stupid to test CPU-heavy scenes in games, which would show a big difference. Most CPUs perform roughly the same in most games anyway unless, like I said, it's CPU-intensive scenes. Even an 8350 performs just as well as an i5 in games that aren't taxing on the CPU.
The point is AMD depends on CPU performance while Nvidia doesn't. People upgrading their GPU get better performance with a new Nvidia card than with AMD.
Do you think you will ever get rid of bottlenecks?
You have an older locked cpu.
Of course it won't get the same frame rates as a 6700K or whatever.
Oh, and the main thing I'm questioning is the claim that AMD hasn't improved their drivers. OP provides no proof of that and claims CPUs are the only factor. AMD for sure has more driver overhead than Nvidia.
If AMD is supposed to be the "budget" brand, why are they so underwhelming on budget hardware?
For what? >5%
And I get to pay more to nvidia for that honor?
So a newer cpu performs better than an old one?
Is that the whole point of this thread?
Shut up nvidiot.
Literally a Google Search away, people have done tests with nvidia and AMD cards on weak as shit CPUs.
>So I asked a friend of mine to lend me his GTX 770 in order to benchmark. It was not the best decision, since the GTX 770 is far more powerful than my HD 7850, but with MSI Afterburner it was easy to see when my GPU was getting bottlenecked. That's why I chose Saints Row IV, Project CARS, Dying Light and Star Swarm, since they are some of the games where my GPU usage is always far away from 99%, which means I am bottlenecked by something.
techpowerup.com
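If you want to replicate what that post does without Afterburner, here's a minimal sketch that logs GPU utilization and counts how often the GPU is clearly not the limiting factor. It assumes an Nvidia GPU with nvidia-smi on the PATH; the 95% threshold, poll interval and sample count are arbitrary choices for illustration.

```python
# Poll GPU utilization and flag samples where the GPU is well below ~99%,
# i.e. the same "not GPU bound, so bottlenecked by something else" signal
# the poster reads off MSI Afterburner.
import subprocess
import time

THRESHOLD = 95   # % utilization below which we call it "not GPU bound"
INTERVAL = 0.5   # seconds between samples
SAMPLES = 120    # ~1 minute of monitoring

def gpu_utilization() -> int:
    """Return current GPU utilization in percent via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

def main() -> None:
    low = 0
    for _ in range(SAMPLES):
        util = gpu_utilization()
        if util < THRESHOLD:
            low += 1
        print(f"GPU utilization: {util:3d}%")
        time.sleep(INTERVAL)
    print(f"{low}/{SAMPLES} samples below {THRESHOLD}% "
          f"-> likely CPU/driver bound during those periods")

if __name__ == "__main__":
    main()
```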
>If AMD is supposed to be the "budget" brand
they're not tho. look at the last wave of cards
they don't compete on the ultra-high-end market like the gtx 980 ti, but they were still good in the upper tier of cards. the r9 390 competed very well against the gtx 970. i don't think it's right to consider cards in the 300-400 dollar range as budget.
underwhelming?
Each and every graph posted still shows fps above 60 and below 144.
Are you so sure multitudes of players are going to notice the 8fps or whatever difference?
>buy a budget cpu, get budget cpu performance
Rocket medicine, folks
>buy a budget cpu
>buy budget nvidia gpu
>get advertised performance
>buy budget cpu
>buy budget AMD gpu
>too bad, should have gotten an i7 extreme Edition goy, what are you, poor?
>tfw on 2500k
i actually overclocked it at last after five years of usage
Yes Goy, buy this Nvidia card for only 20% more!
The 2% difference in frame rates will make your gaming experience unbearable!
Buy Nvidia, the way you're meant to be played!
>no arguments
dumb nvidiot
Bullshit, I get 5500 in Firestrike on my AMD card, same as most anyone else. My results are actually in the top 20% for legit benches, and I use an 8-year-old CPU.
It's sad that redditfags are smarter than AMDumbs...
Literally nothing wrong with AMD GPUs.
They are essentially equal in performance.
It's why you shills have to resort to making a 8fps difference a huge deal.
OH MY GOD 72fps vs 79fps AMD IS DEAD
You realize you sound ridiculous, right?
OH MY GOD -7% IN SOME RANDOM GRAPH NOBODY WILL NOTICE
>gpu only benchmark
>has nothing to do with cpu
Every time I press pause the i5 has a faster fps.
79-72 != 8 you retard
Actually every time I press pause there is a difference of 1-2 fps between the i5 and i7 so not much difference in the Intel scores. The FX is shit though.
holy fuck amdcucks have been literally blown the fuck out
also im going to pretty much die of laughter when amd plummets again after so many plebs bought stock @ $5.00
>retarded nvidiot
it's even more sad just how much AMD fucked over the nano/fury launches 2bh
They could have at least dropped it by 50mhz and undervolted to make it more appealing from a power consumption/thermal standpoint.
>8 fps
DELETE
Yes, point out the glaring error and ignore my question:
Do you really think 5 fps or whatever is going to vastly improve gameplay?
I don't think anyone's going to notice a difference outside these charts.
And anyone that would notice something like that already has the best equipment possible.
Only thing I have learned is:
i7 > i5
old cpu < new cpu
when will we get some charts from a consumers point of view
>start at 8
>drop to 7
>now at 5
No, the -69fps difference won't matter at all!!!
Only a retard would buy a 6300; this proves nothing.
They're really just making weird choices all around quite honestly. It's as if they just don't understand what the fuck people want, and they've talked themselves out of trying to appeal to normfags.
If that 8, 7, 5, or -69 fps is the difference between maintaining a smooth 60 fps or not, then yes, I'm sure a lot of people would be bothered.
>nvidia works better than amd even with amd cpus
CAN'T MAKE THIS SHIT UP
What is it with AMD and their 980 comparisons? Wouldn't it benefit you to just go the Nvidia route and say "this shit is 300x faster than our previous line" instead of just getting fucked through all these benchmarks?
How is it that AMD gpus suck so fucking much with cpus that aren't 600 dollars?
According to OP's logic, shouldn't old Nvidia cards also age better?
The graph OP pictured has many combinations if you look closely. I am also talking about all the other graphs posted in this entire thread.
I am really unconvinced this will make any real-world difference. You still have not really convinced me such an insignificant figure is worth all the hate. It's like there's a YUGE performance hit from all the shilling. In reality, it's really nothing.
So:
i7 > i5
old cpu < new cpu
really surprising to you?
6300 was actually smashing value as long as you didn't use radeon graphics
The 980 is THE price to performance card of the last gen
lol based. op kys deceitful fag
>970 loses 13fps going from 6700k to 9590
>390 loses 28 fps going from 6700k to 9590
No matter how you look at it, that's fucked up. It's more than double the performance loss on AMD's side compared to Nvidia's.
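Quick sanity check of that "more than double" claim, using only the two loss figures quoted above (the baseline frame rates aren't given here, so this compares absolute fps loss, not percentage loss):

```python
# Absolute fps lost when moving from an i7-6700K to an FX-9590,
# as quoted in the post above.
gtx_970_loss = 13  # fps lost by the GTX 970
r9_390_loss = 28   # fps lost by the R9 390

ratio = r9_390_loss / gtx_970_loss
print(f"R9 390 loses {ratio:.2f}x as many fps as the GTX 970")  # ~2.15x
```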
They work fine nvidiot.
Are you so blinded you didn't notice a $200 card was damn near as good as Nvidia's $650 card six months ago?
A 6300? It's old, and it's shit. Is this surprising?