AMD is shit

Why is AMD so shit? Pic related.

They focus on meme technologies like Async and Vulkan instead of proper support for shit people actually use like OpenGL

Situation will reverse later this year when DX12 shits on nvidia cards

And then Nvidia develops DX12 compatible cards that destroy AMD's, and people buy them because they aren't foodstamp-poor like AMDrones

Literally this.
Why the fuck didn't they develop their architecture for OpenGL?

Vicious cycle of lack of money means lack of resources for R&D and software support.

because amd doesn't know how to optimize their dx11 drivers for shit

How much more actual FPS though, not just %?

Because AMD a shit.

>AMD is yet again pushing progression
>Let me try and spin it in a negative light

I love how exaggerated this graph is and how nobody seems to notice.

Great job nvidia shilling team!

>AMD shit
>Sup Forums post AMD hate thread every day


I think Sup Forums is just mad cause AMD is making a comeback.

I know right?

Nvidia is better with vidya oriented architectures.

Who cares for compute and async lmao

>technological progress is a meme
nvidiots everyone

its the Sup Forums spillage that comes here to shitpost that hates AMD
Sup Forums has always liked AMD

>buy amd card
>costs 50 dollars less
>comes with more RAM than advertised
>Improves performance over time
>makes money passively, ultimately paying for itself
>Loses by a couple frames in unoptimized DX11 shit to cards more expensive than it, but the difference isn't really noticeable
>Crushes the more expensive card in newer games.

meanwhile
>buy nvidia
>pay more to Play the Way its Meant to Be Played(tm)
>has less ram than advertised because "Th..that was extra goy!"
>performance falls off a cliff once a new card comes out
>card just costs money, never makes money
>wins by a couple frames thanks to texture streaming and gimpwerks(tm) implementation The Way it Was Meant to Be Played(tm)
>gives the ultimate SlideShow Experience(tm) with modern technology.

so hard to choose

The 1060 already beats the 480 in DX12. Things will only get worse for AMD from here on out.

The OP image says you're full of shit.

>Hardware unboxed
Nvidia shills

>AMD shot their load too early
>amd cuckolds still defend it

Why would they do this?

>buy amd card
no one has seen one in stock in the month since release; even with AIB reviews out, still no AIB cards have actually launched.

Is it normal for a 280x to often require me to run medium-graphics games in a smaller window rather than fullscreen if I want perfectly smooth performance?

Graphics of about this quality is what I'm talking about. Still have special effects, but no exceptional texture or mesh resolutions.

There's a $50-$100 price difference

>its the Sup Forums spillage that comes here to shitpost that hates AMD

Not that both Intel and nVidia are known for shady tactics at all.

Eat a dick

Meaningless numbers, literally comparing apples and oranges. You can't simply compare percentages that don't share the same base value for 100%.

>20 vs 25fps
>25% FASTER!!!!

>100 vs 105fps
>5% faster

The 1060 still wins in most games, but the graph is misleading and useless for measuring how much faster it is.
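The arithmetic behind that complaint is easy to sketch. The frame rates here are the hypothetical ones quoted above, not measurements:

```python
# Two hypothetical matchups with the exact same 5 fps absolute gap.
matchups = [
    ("low-fps game", 20, 25),
    ("high-fps game", 100, 105),
]

for name, slow_fps, fast_fps in matchups:
    gap = fast_fps - slow_fps
    pct = gap / slow_fps * 100
    # Same absolute gap, wildly different percentage "win".
    print(f"{name}: {gap} fps gap = {pct:.0f}% faster")

# prints:
# low-fps game: 5 fps gap = 25% faster
# high-fps game: 5 fps gap = 5% faster
```

Both bars would show the same 5 fps advantage, yet one reads as a 25% blowout and the other as a 5% edge, which is the whole objection to averaging such percentages.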

Just like DX10 swept the world and OpenGL reigned.

>Intel's compiler makes programs run up to 47% faster on Intel CPUs, even when the same CPU merely reports an Intel vendor string
So a huge amount of software has had its performance crippled on a huge portion of the computers out there, and AMD has probably suffered billions in losses from the benchmarks, since even a small performance gap can swing opinion, let alone one that big.
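As an illustration only, the dispatch behavior described above can be modelled as a toy function. The function name and path labels are invented for this sketch; this is not Intel's actual code, just the gist of vendor-string-based dispatch:

```python
# Toy model of CPU-vendor dispatch: the optimized path is chosen by
# vendor ID string rather than by which SIMD features the CPU reports.
def pick_code_path(vendor_string: str) -> str:
    """Return the optimized path only for a 'GenuineIntel' vendor ID."""
    if vendor_string == "GenuineIntel":
        return "fast SSE/AVX path"
    # Any other vendor gets the slow path, even if it supports SSE/AVX.
    return "generic fallback path"

print(pick_code_path("GenuineIntel"))   # fast SSE/AVX path
print(pick_code_path("AuthenticAMD"))   # generic fallback path
```

The point of the complaint is exactly this: the branch keys on the vendor string, so identical silicon claiming "GenuineIntel" lands on the fast path while an AMD chip with the same instruction-set support does not.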

How has Intel not been sued for like, ten billion dollars or something?

20 vs 25 being 25% faster is legitimate though.
If it takes a hundred million computations for a frame for a slow game and ten million for a fast game, then if something runs at 6FPS on the slow game and 60FPS on the fast game it's clearly able to do 600 million computations a second.

If something gets 9FPS on the slow game and 90FPS on the fast game, then it's clearly able to do 900 million computations a second.

And no matter what things you compare; computations a second, slow FPS, fast FPS; you get the answer that the second GPU is 50% faster.
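The worked example above can be run as a quick script; all numbers are the hypothetical ones from the post:

```python
# Hypothetical workloads: computations needed per frame.
slow_game = 100_000_000   # 100M computations per frame
fast_game = 10_000_000    # 10M computations per frame

# GPU A: 6 fps in the slow game, 60 fps in the fast game.
# GPU B: 9 fps in the slow game, 90 fps in the fast game.
throughput_a = 6 * slow_game          # 600M computations/s
throughput_b = 9 * slow_game          # 900M computations/s

# Sanity check: the fast-game numbers imply the same throughputs.
assert 60 * fast_game == throughput_a
assert 90 * fast_game == throughput_b

# Whichever metric you compare, GPU B comes out 50% faster.
assert 9 / 6 == 1.5                   # slow-game fps ratio
assert 90 / 60 == 1.5                 # fast-game fps ratio
print("speedup:", throughput_b / throughput_a)  # prints: speedup: 1.5
```

Because fps is directly proportional to throughput for a fixed workload, the ratio is the same no matter which of the three quantities you compare, which is why 20 vs 25 fps really is a legitimate 25% difference.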

>comes with more RAM than advertised
This doesn't happen unless it's some limited rebranding scheme to pretend a card exists.
(One Nvidia partner has done that too in the past: GTX 465 = GTX 470)

4GB RX 480's are 8GB RX 480's with 4GB disabled, but flashing an 8GB BIOS enables the other 4GB.

Intel had to pay AMD over a billion dollars a few years ago for this and other anticompetitive practices.

Uhh the numbers are literally exactly as they state. It's not misleading in the least bit.

Notice how the 1060 even wins in Ashes DX12, an AMD favored game.

Not even shown in the graph are Tomb Raider (DX12) and Time Spy (DX12), where the 1060 also wins.

1060 is much better than the 480 in DX12...and everything. It's just a much faster card.

>Tomb Raider (DX12) and Time Spy (DX12)
>DX12
lol no they don't

They probably flashed some of the 8GB RX 480s with a 4GB BIOS because they wanted to avoid bait-and-switch accusations. Because that's pretty much what's happening with these "$199" cards.

Here's the 1060 easily beating the 480 in Tomb Raider (DX12)

Here's the 1060 easily beating the 480 in Time Spy (DX12)

That's exactly what I was talking about. Good luck with that one if they ever start producing the 4GB variant.

Nah, it's more that they just had one production line to reduce costs.
For some reason, 4GB of VRAM costs less than a new production line to spit out 4GB cards

>Use experimental technology no one gives a fuck about yet instead of making proper support for shit people actually use
>AMDrones try to spin this off in a positive light
Meanwhile Nvidia has proper Linux support, DX11 support, and OpenGL support. Shit that in some cases has a massive performance decrease for AMD because their Rajeets can't code.

Because AMD really sucks at creating reference coolers.
The Sapphire Nitro or PowerColor Devil are on par with the 1060.

Nice reading comprehension illiterate nvidiot

>in Time Spy (DX12)

>Tomb Raider
>(DX12)

DELET

>And then Nvidia develops DX12 compatible cards
they would have to completely redo their architecture and change or replace CUDA cores.

Nvidia already easily beats AMD in DX12

nice meme

hardware unboxed are shills user find a different source

meme? it's the truth, look at the charts above

the 10 series is very good at dx12, far better than anything from amd

here's your (You)

$0.02 has been deposited in your account

Yes it's a dumb shit but:
amd < nvidia < intel < arm

Sorry you can't handle the truth

miners have been buying hundreds of them
the 480 is the best card for it

Nice P/W there AMD

Lmao

DEL

Here's the RX 480 beating it in value.

What happened?

RX 490 when?

And here's the 1060 easily beating it in value

The 1060 is by far the best value card you can possibly buy

This makes sense. That's why I can't find problems anywhere regarding AMD's open drivers and many problems with Nvidia's.

My guess is that you can't make Linux shill that much for a particular hardware vendor since the kernel just adapts to whatever is on the market according to everyone's needs.

That's an old graph. Mine is newer :P

And this is the newest.

Well, in my country they cost roughly the same.
And the RX 480 is the reference model.

Why anyone would buy AMD is beyond me. They fuck up everything. Performance, drivers, power consumption, marketing, pricing.

>Why anyone would buy AMD is beyond me. They fuck up everything. Performance, drivers, power consumption, marketing, pricing.
because you are an nvidia fanboy

not an argument

Also I'm posting from laptop without AMD or nVidia crap. Get fucked.

>DX11, OpenGL, and Linux are all Sup Forums and Sup Forums

Hmmm, it's almost like AMDfags are also the same Microsoft Pajeets who also shitpost anti-Linux threads along with their DX12 cocksucking...

kys nvidiot

even TPU clearly shows AMD wrecking and they're notorious nvidia shills

>Linux
have you even used AMDGPU lately or are you the kind of moron that spouts hearsay from two years ago?

Nvidia has official Linux support miles beyond AMD.

AMD Pajeets can't even support common APIs yet brag about how they can copy paste Vulkan and DX12 code cause its "next gen"

Because they're trying to compete with literally a tenth of the competition's budget. Everyone knows (unless you're a brain-dead shill) that Intel and Nvidia make the best CPUs and GPUs; AMD is just there for market competition because Intel and Nvidia want them there. AMD used to have the "we are cheaper!" argument, but now their stuff is the same price as Nvidia's or even more expensive. A board-partner 1060 is £229 but a board-partner 480 is £250.

Someone seems to have forgotten some history, such as the original 480 house fire, 196.75 drivers literally burning cards up, and the 4870 vs the 280.

performance/dollar, price/performance, cost/frame are all fucking retarded metrics. Graphics cards aren't expensive enough to justify pinching pennies on them.

I'm pretty sure the only reason that low-end cards only sell for like 30 bucks less than entry level while delivering 1/4 the power is because if they didn't, poorfags would try and quad-SLI them because it's such a good deal! And then complain when it wasn't as good as a gtx x80.

DESU I don't trust benchmarks *at all* anymore, because sites only bench at ultra settings, and a lot of those settings cripple AMD more than they do Nvidia

if you did a *high* settings comparison it would be a lot closer, especially in games like witcher with HBAO+ and stuff

with a 290x clocked at 1100 I can play at 1440p high settings averaging 55 fps. according to benchmarks that would be my framerate at 1080p

I don't know what you're talking about; AMD has had fantastic drivers for almost two years straight and NVIDIA has had almost double the software fuck ups, as well as the 970 meme scheme in recent memory

>Nvidia has official Linux support miles beyond AMD.
nvidia has a fuckhuge binary blob that's only marginally better at OpenGL perf, and who knows for how long. the nvidia-settings tool is fucking shit and includes a fourth of the features you get on Windows. plus if you want to do anything multi-monitor, you need to manually save their generated Xorg.conf, which breaks most of the time. it also takes them months to fix anything, like not being able to output video on monitors with corrupt or missing EDID.

t. disgruntled 970 owner

meanwhile amd's drivers are foss, built into the kernel and are constantly updated with more than just opengl optimizations

stop living in 2010
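For what it's worth, the multi-monitor layout that nvidia-settings saves into Xorg.conf looks roughly like this; connector names, resolutions, and offsets are placeholders and will differ per setup:

```
Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    # One metamode string describing both monitors side by side;
    # this is the part that has to be regenerated whenever it breaks.
    Option         "metamodes" "DP-0: 1920x1080 +0+0, HDMI-0: 1920x1080 +1920+0"
EndSection
```

If this section gets out of sync with the actual monitor topology, X can fail to apply the layout, which is the "breaks most of the time" complaint above.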

>if I turn down the settings I get better performance
>this is because of nvidia crippling me

Are you trolling or really this dense in the head?

you have to be completely retarded to misinterpret what I said

what I said is that the performance difference between nvidia and amd increases going from high to ultra settings in most games, especially anything involving tessellation or nvidia-favored AA or AO

if you were to run the same cards at reasonable settings the difference wouldn't be so high

The combined effort of AMD supplying shit proprietary drivers, AMD supporting open source drivers and Intel bankrolling mesa development is finally paying off. Soon AMD will pull the plug on their proprietary OpenGL stack and replace it with mesa even on Windows.

its the poor mans brand m8

AMD is doing the smart thing by keeping a common code base for their GPU drivers on both Linux and Windows.

To be fair, that's probably Nvidia's strategy too, but who the fuck knows.

Why would you want Mesa for Windows? It would be a lot of work, take a long time, cause regressions, and when it's stable, GL is already well into the sundown phase and only legacy code uses it.

nVidia definitely shares code across the platforms. I've seen the same driver bugs on both platforms which kinda indicates that.

>let's play at lower settings so it gives amd more of a chance!
>let's not push each card as far as they can with the highest settings!

So you really are dense in the head then. That's not how benchmarks work. Reviewers won't purposely benchmark at lower settings for stupid reasons like yours.

>most games

So you're saying nvidia has Gameworks in "most games", which is what causes this? Gameworks is in a very select few games, not "most games" like you're trying to imply. If there's a bigger regression between ultra and high on amd cards in non-Gameworks games, that's solely AMD's problem for creating shit hardware, and also your fault for buying said shit hardware.

>>let's not push each card as far as they can with the highest settings!
that's not what I'm saying
what I'm saying is very few people, and mostly stupid people, play games at everything on ultra settings

what I'm saying is there should be more benchmarks of games at different settings for all cards, because "Ultra" rarely means that, and very many games have crippling settings that should be disabled on most cards
>>let's not push each card as far as they can with the highest settings!
no, I'm saying that lower settings and higher resolutions would genuinely push the GPU capabilities rather than being crippled by unoptimized gimmick effects

Why not? Mesa is already better or equal to AMDs own GL stack in many cases and MIT licensed. Maintaining a proprietary stack for sundown tech doesn't seem too great if you could just use an open stack with other partners paying for part of the development.

>very few people play on ultra settings
>you're stupid if you play on ultra settings for some unknown reason

HAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

Get a load of this chump. If I play bf4 or even a game like witcher 3 on ultra settings I'm stupid even though my gpu is perfectly capable at the resolution I play at. Go back to you colossal retard.

yeah, you are dumb for leaving foliage distance on ultra instead of high. I bet you don't even have a great card; I bet you have a 970 and play it at "ultra" 1080p instead of turning some settings down to high and downsampling for a sharper image

also, you don't buy a midrange gpu to play low resolutions on ultra settings unless you are extremely retarded

and as tom puts it:

tomshardware.com/reviews/amd-radeon-rx-470,4703-8.html

>Its GTX 1060 looks especially good in DX11-based games. But an increasing emphasis on DX12 and Vulkan support does the $250 card no favors.