Is it still true that, at least as it stands today, Nvidia has objectively superior hardware to AMD, as well as better QA and, at least perceptually, better drivers?

Flipped on its head.
AMD in theory has the better hardware, with more power behind it, but lacks the software support to see much real use outside of number crunching.

Meanwhile nVidia has paid CUDA's way into everything.

Overall, no.
While AMD truly does not have an answer to the 1080/1070 at the moment, AMD usually carries the advantage in terms of actual hardware on comparable devices. This is, however, offset by Nvidia's driver magic (and shady practices, depending on who you ask).

Where AMD does carry an advantage is FreeSync, which is arguably better than G-Sync and undeniably a better business model. AMD's business practices overall also tend to benefit the industry as a whole.

It comes down to whether you care about the way Nvidia does business, and how long you go between upgrades. Historically, AMD cards hold up better over a device's lifetime, and there is (a small amount of) evidence that Nvidia may actively "gimp" obsolete cards. That said, Nvidia generally carries the performance advantage at the high-end "latest and greatest".

>Meanwhile nVidia has paid CUDA's way into everything.
Memes aside, they mostly didn't. CUDA was released before OpenCL, and Nvidia also opened the PTX assembly syntax, which guaranteed some portability between architectures; because of that, even clang can compile CUDA code. AMD is only opening its assembly this year (probably next), i.e. almost ten years later. It was about a better programming environment.
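
For what it's worth, the "even clang can compile CUDA code" part is easy to sanity-check yourself. A minimal sketch follows; the file name, GPU arch (sm_52) and CUDA install path in the compile line are assumptions, so adjust them for your own setup:

```
// saxpy.cu -- hypothetical file name for this sketch.
// Builds with clang as well as nvcc, e.g. (paths/arch are assumptions):
//   clang++ saxpy.cu --cuda-gpu-arch=sm_52 -L/usr/local/cuda/lib64 -lcudart -o saxpy
#include <cstdio>
#include <cuda_runtime.h>

// Trivial kernel: y = a*x + y, one element per thread.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));   // unified memory keeps the demo short
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);                // expect 4.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

Nothing interesting about the kernel itself; the point is that the toolchain isn't tied to nvcc.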

>Where AMD does carry an advantage is FreeSync, which is arguably better than G-Sync and undeniably a better business model. AMD's business practices overall also tend to benefit the industry as a whole.

No discussion here, you're absolutely right.

Eh, so please tell me why freesync is better than gsync, other than the price difference?

Also, in the professional world AMD has 0 chance.
Nvidia has established itself as a major player there.

And regarding consumer shit, AMD has nothing to counter the GTX1070/GTX1080.
And they won't until next year; by that time, however, Nvidia will have already brought out refreshes.

Unless AMD pulls off some engineering breakthrough, the sheer difference in available resources should tell you the whole story.

I will never be team red because I fell for the FX-8350 meme. AMD as a whole may be a better company, but it produces the most inefficient pieces of shit ever.

Depending on when you fell for the 8350 meme, you may have done good. If you fell for it more than about a year and a half ago, you'd have come out pleased.

Because G-Sync uses special hardware in the screen to work, which means lock-in and a price markup. FreeSync works in software, so it's compatible with more stuff.

I bought it at the same time the Intel G3258 came out. I overclocked the G3258 to 4.2 GHz, which is the boost clock of the 8350. In all of my real-world tests the G3258 was perceivably better, at a third of the price.

Were those real world tests playing Skyrim in 800x600 on lowest settings per chance?

Nvidia is, in the best case, the same thing as AMD for more money; in the worst case you get something like Kepler ageing like milk, Maxwell well on its way to the same fate, an exploding 1080, 3 GB in 2016-2017, 3.5 GB, and no async on Pascal. The thing is, Nvidia has way more spectacular fuck-ups than AMD when it comes to GPUs.

nVidia has better hardware and software.
AMD does more open stuff so better for the community overall.

AMD also tends to overbuild its hardware to offset its driver/software issues, as well as to boost longevity.

AMD has better software and hardware, but Nvidia does more for puppet-source community, so it's bettah.

they don't use special hardware.

They use an additional ASIC to measure latencies more efficiently and accurately than the scaler built by the monitor manufacturer, in order to ensure a standardised experience.

That's it. The rest is embedded in software as well.

I forgot to mention that there are companies like Eizo which also build their own scaler and scaler software to create an even better experience than G-Sync - the Eizo FS2735, for example... but look at the price.

>nVidia has better hardware and software.

Depends what you are doing - for some tasks AMD absolutely crushes Nvidia.

Eizo make some of the best screens money can buy and charge accordingly.

AMD has much better hardware than Nvidia; they need it to brute-force their way into the performance competition despite the lack of polished drivers.

>despite the lack of polished drivers.

When they do get their act together this happens.

Nvidia
>Makes top tier gaming cards with a great balance in price, performance, and efficiency
>quality software and innovations
>Researches cures for cancer and the creation of artificial intelligence

AMD
>Cheap gaymen cards with severe hardware and software issues
>makes Bollywood movies

>Researches cures for cancer and the creation of artificial intelligence

I think you mean Nvidia causes cancer - Sup Forums is living proof.

>cures cancer

I guess you don't know how the cancer "business" works.

Lol, I dunno when this screenshot is from or what the current situation is, but shortly after release DX12 actually performed worse on Nvidia than DX11 in that game.

>It's another "AMDrone pulls up a poorly optimized AMD-branded game" episode

this would actually count if AMD had the funds to do this for more than 5 or 6 games

> DX12 actually performed worse on Nvidia than DX11

Should've stopped there.

Gee willikers, user - how do Nvidia's year-newer cards continue to struggle to beat Hawaii?

so, special hardware?

What's special about an ASIC? Your GPU is an ASIC.

>struggle
>literally wiping out anything AMD has in 99% of games

You do realise Pascal is 3 generations newer than Hawaii, right? What you should be focusing on is the 980 getting beaten by a slightly angry 290 (aka 390).

that 390x needs to draw twice the power to get even near the 1070 :^)

So does a 780ti.

In my limited experience, Nvidia has been the go-to choice. Granted, I'm thinking in terms of Linux compatibility, but I also got burned by the whole Optimus fuckery. Take that however you will.

Good thing I live in a country where each individual wall socket can supply 2,400 W for cheap, so I don't give a single fuck.

Give me 2x watercooled 390X and a fucking AX1500i

Optimus is the reason why Linus is an angry old man.

Optimus is why there's a bunch of us angry old men.

>He needs watercooling and a shit ton of power just for his Pajeet Poo-in-the-GPU to even be close to a Nvidia GPU which doesnt go past 80c on its stock cooler
>he wears this like a badge of honor

fucking AMDelusionals man, I swear
gets funnier every time

>2013 vs 2016

I bet you feel special when you beat up old ladies.

Says the faggot with his little 1.2L turbo

Us real men drive V8s

Have fun with all that heat :^)

Stop being poor.

>tfw my MSI 1070 doesn't even go past 60°C while maxing out Witcher 3

If it wasn't for the shittier color reproduction, reduced distant-object detail in games for muh fps gains, gimping of older hardware, a shittier control panel, and overpriced G-Sync, I'd go with Nvidia. But they've gone full greed mode since the 500 series. People who stick with the same Nvidia card for more than 3 years are screwing themselves into oblivion.

meanwhile Pajeet needs to run his air conditioner and use a watercooling block in the middle of winter for his Poo-in-GPU to catch up.

>just upgrade every year goy!

A person's wealth has nothing to do with it - it's about not being stupid.

Just because I could buy a million hamburgers doesn't make it a sensible decision.

It's okay, the amount of money you pay to power your AMD piece of shit has probably far exceeded the price of a 1070 by now
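
Back-of-the-envelope, with assumed numbers: say the draw difference under load is ~125 W (a ~275 W 390X against the ~150 W people quote for the 1070), four hours of gaming a day, and $0.15/kWh. That's 125 W x 4 h x 365 ≈ 180 kWh a year, or about $27. At that rate it takes well over a decade to "far exceed" the price of a 1070. Swap in your own wattage, hours and tariff, but it doesn't move by an order of magnitude.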

...

Fucking retard.

>Researches to cause cancer and the creation of botnet
ftfy

Joke all you want but Nvidia actually contributes to technology way more than AMD

The most AMD has ever done is help with VR Bollywood movies; Nvidia is involved in AI and scientific research.

2kw air-con

>multicore isn't a contribution to technology
>64-bit isn't a contribution to technology
>preventing monopolies isn't a contribution to technology
>all our modern video DRAM standards aren't a contribution to technology
>Unified shader architecture isn't a contribution to technology
>Every modern low-level API isn't a contribution to technology
>Async isn't a contribution to technology
>HSA isn't a contribution to technology
>Graphics on a CPU isn't a contribution to technology
>Driving down the costs of computers isn't a contribution to technology
>Stacked RAM for video cards isn't a contribution to technology
>Endorsing open-source standards isn't a contribution to technology

Yea i guess AMD is just useless then

>closed source gimp code in vidya is important
>vidya gayme physics is important
>housefires is important
>vidya is important

No go- I mean guy, they are not.

Oy vey you want 4 cores, 2 is enough, why would you need any more than 1 core?

Why do you think Intel are releasing an unlocked i3? They want to crush zen before it makes a lot of their prices look like the ripoffs they are.

I was told years ago that AMD gives you better value throughout the product range versus Nvidia, until you get to the top end.
Is this true today?

Depends on the games you'd play.

Next year I'm upgrading from an i5 2500K, 8 GB DDR3, and an RX 480 to a Zen 8-core/16-thread, 32 GB DDR4-3000, and another RX 480.

The best part is that my 600W PSU will manage everything.

I can only dream of the gains.
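
For what it's worth, the 600 W claim roughly checks out on paper, assuming stock-ish clocks: two RX 480s at ~150 W board power each plus an 8-core Zen at a guessed ~95 W comes to about 400 W, and the rest of the system rarely adds more than 75-100 W. That's roughly 500 W peak against a 600 W unit, so it's workable, but the headroom gets thin once you overclock or if the 480s spike above their rated board power.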

I've used both. Nvidia gets updates daily; my current AMD card gets new drivers every year. They have both done exactly what I've asked of them flawlessly. Don't buy into elitist bullshit.

wew lad that seems very legit

Don't expect much.

You're either clueless or a troll.
I have a [email protected],
but an AMD 8350 on an overclocking board with watercooling gets damn near my performance, if only with much more heat.
So if you live in the southwest and you're roasting yourself with it, I understand,
but don't give us this shit that an 8350 is worse than a G3258, because that's simply not true.
Hell, just look up the PassMark overclock average benchmark of an 8350. I could see how it would be an honestly good deal if you live up north where it's cold a lot.

My vishera.

Much over a Sandy Bridge? We're talking about 5-6 times the transistor count, a far better and more modern architecture, not to mention double the memory speed and another GPU.

I'm going to do the same upgrade, except I'm going full-fat Vega and a 4K monitor. I just hope my AM3+ cooler fits AM4 (no huge loss if it doesn't), as I'm going to crank that Zen chip the fuck up. Dammit, I've got a cooler capable of handling over 200 W and I'm going to prove it.

I upgraded my GPU from a GTX 580 to an RX 480 and was seeing only like a 40-50% increase. I knew something was wrong, because the RX 480 is almost 2.5-3 times better. Then I re-OCed my CPU from 3.4 GHz to 4.6 GHz and saw an insane rise in performance; the GPU has been unlocked, pretty much. I still wonder what performance increase I'll see with a Zen.

>When they do get their act together this happens.
>DX12
You mean when the game bypasses their crap driver in large part and is also specifically optimized for AMD hardware?

Raja claims 50% of GPU performance is on the software side, and there is no way, in a general sense, that GCN is 50% faster than Kepler onwards.

That said, lol kepler.

Funny because even Intel Pentiums absolutely destroy AMDshit

and Zen is confirmed Bulldozer 2.0 with Sandy Bridge tier single core

>this is what Sup Forums actually believes

>integrated graphics
good one Pajeet
APoos are the only niche market AMD is good for, and Intel BTFOs them with Iris

Is it worth waiting for PCIe 4.0? I've had a Core 2 Duo for 10 years.

Iris is still behind the latest APUs. Plus, consider that Iris is on chips with L3 cache (and L4 in this case) along with double the CPU IPC, yet it still loses.

Then again, Intel is learning the hard way that GPUs aren't just CPUs with loads of cores. Intel is woefully unprepared for the sort of software support required to actually make gaming viable on its platform.

> Iris
Same power for twice the price! What a deal!

Yes, but it's hardware by definition, pajeet.

...

Nvidia should get on the phone to jackie Sup Forums - they'd have decades worth of data in a month with the amount of cancer flowing through this place.

oh well good bye ngreedia
amd.com/en-us/press-releases/Pages/amd-radeon-technology-2016nov15.aspx?sf42498801=1

Hawaii stronk apparently.

Hawaii is what happens when brute force is applied to computer science.

BTFO

vrworld.com/2016/05/23/amd-working-vr-bollywood-movie/
POO
IN
LOO

>Machine Learning
>AyyMD

Google's gonna have a hell of a time with that one.
Bet AMD sold those pieces of shit at bargain prices.

no benchmarks for R9 380 non x

No sympathy for the tripfag.

>GTX 1070, Nvidia's midrange card, destroying all of AMD's high end offerings with no problem at only 150w of power usage

what's the problem here?

>550$ in most countries outside the US
>midrange card

>great balance in price, performance, and efficiency
>price

No.

This thread is dead. Full of Nvidia apologists.

Of course it is, retard.

With AMD you pay the price through wasted time and power bills

Also having to get a better PSU and mobo than you really need to make sure it doesn't fry your computer.

AMD is superior in every way because they care at least a little bit about freedom