Using nvidia

>using nvidia
>2016

DELEET DIS

I still don't understand this.
Why would it be anything but legacy when it's not in production anymore?
It's still getting driver updates and performance enhancements, which is more than can be said about AMD legacy products.

Saged because this stupid shit doesn't deserve a bump

>nvidia
top kek people still buy this shit

Running on a 8600GT right now.
Feel me faggots.

>butthurt nvidiot cant accept his overpriced dumpster gpu doesnt get drivers anymore
>SAGE SAGE HIDE REEEEEEEEEEEE DELET THIS
umad

Year old drivers are better than no drivers, am I wrong?

Sorry im not a poorfag who was dumb enough to buy nvidia.

>nvidia
ahahahahahhahah

Legacy doesn't mean unsupported.

OWNED. AMDFAGS GETTING NOWHERE. AMDFAGS ON SUICIDE WATCH. #JELLYAMDNODRIVERS

yea it does

What does "legacy" mean? By AMD's standards doesn't it mean no longer getting driver support? What was the last family of cards dropped from Nvidia's mainline drivers: the 200 series of desktop cards and the 300 series of mobile cards (plus the 405)?

Why do AMDrones keep bringing this up? Because they are tech ignorant?

>AMDrones
back to /reddit/

Would it make more sense to call ignorant fanboys of Nvidia "nvidiots"?

Hey guys this is a GPU thread and I'm gonna ask about my GPU. Is it okay to pair an i7 with a Radeon HD 5750 for a couple months? I'm waiting for the prices of the newest cards to settle.

>Radeon HD 5750
Aren't modern igpus more powerful?

>when nvidiots stim out because amd makes better cards
youtu.be/yTU8WbTbZMI?t=45s

I don't know, are they?

>not having 2x1080 watercooled in sli

fucking poorfags, kill yourself.

Ah. I understand. You're an AMDrone who projects his fanboy idiocy on others. Don't be a fool. People can hate idiotic fanboys of both brands. But feel free to shit up the thread with your nvidia hate. It'll only prove how idiotic AMDrones are, as you aren't actually addressing anything in this thread with your pathetic trolling.

What does previous-generation mean to you? You think 900 series is current gen? Fuck outta here retard.

Gpu wars are cancer but why is it always amd fanboys that start these cancerous threads?

>not having 2x pascal titan watercooled in sli

fucking poorfag, kys

>tfw you cant shill nvidia cards because no one wants to buy outdated trash
youtu.be/j4PTf7LgsIE?t=8s

>amd fanboys that start these cancerous threads?
are you new here

>my brand loyalty makes my opinion the only one that matters
>anyone who doesn't subscribe to my fanboyism is an idiot and wrong.

you must be american.

Most AMDfags are; their international pricing is insanely stupid.

you must be autistic
see

buying what i want and not shilling anything doesn't make me autistic, it makes me a decent human being. what's your excuse?

>no one replied to my post
>maybe if I link to it people will notice
;-)

>admitting to being a pajeet or a EU cuck
wew lad, remember the EU cuckold union isnt the whole world

Looks like the igpu in the i7-6700/i5-6600 is about as good as a 4870.
tomshardware.com/reviews/skylake-intel-core-i7-6700k-core-i5-6600k,4252-9.html
And the 4870 was about 10-15% faster than a 5750 when it was launched.
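Chaining those two data points gives a rough answer (toy arithmetic only, not a benchmark; both ratios are approximations pulled from the posts above):

```python
# Back-of-envelope: chain the two approximate performance ratios above.
igpu_vs_4870 = 1.00            # assumed rough parity, per the Tom's Hardware review
r4870_vs_5750 = (1.10, 1.15)   # HD 4870's quoted launch-era lead over the HD 5750

# Multiply through to estimate where the iGPU lands relative to the 5750.
igpu_vs_5750 = tuple(igpu_vs_4870 * m for m in r4870_vs_5750)
print(igpu_vs_5750)  # roughly 10-15% ahead, so the 5750 would be a downgrade
```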

>buying nvidia
>decent human being
fuck no now fuck off pajeet poo in loo

>yea it does
No, that's pretty much the opposite of what it means.

>anyone who isn't american is indian or european.

this is why your country is ranked 17th in education.

Are you? It's been like this for years. Amd fans are more cancerous than Applefags.

and yours is first in cuckolding

>Amd fans are more cancerous than Applefags.
literally only assblasted nvidiots think this now

most of the world prescribes to the idea of doing what you want as long as it's legal. buying a product flows with that, if you don't like what i buy get the fuck over it. you're not higher or mightier than me by being an annoying twat about the things you bought on a message board. i never even said shit about nvidia either so your assumptions to whatever side i lay on may or may not be wrong. the whole point is you're just perpetuating cancer, buy the card you like and shut the fuck up about it.

Stop coming to this site.

lick the darkest part of my asshole

Guys, you know the best way to deal with shit bait?
Stop replying.

Just watch as post quality goes up the less you reply to bait!

suck my american dick

Okay, thanks user. If I already own the card, should I put it in?

Wouldn't that depend on your iGPU? You might be downgrading if you install it.

shoulda gotten a 5770 and another 5770.

I owned a 290 up until 8 months ago you spastic

I bought it used for $10.

>1060 is better than 480
>anons on Sup Forums say I should still get the 480 because it "ages" better

what does this even mean? why wouldn't I get the better performing product? the 480 is worse in dx11 (by like 10-15% in some cases) and slightly better in the three dx12/Vulkan games that are out. why would I buy a card for 0.5% of games out there? by the time dx12/Vulkan games become more mainstream, the 480 will be too old to run them

i'm probably going to end up getting a 480 just so I can use freesync, but it fucking sucks. i'm knowingly buying a worse product because of nvidia's jewish tactics (gsync).

It's the i7-6700, non-k.

It seems like desperate speculation to justify a purchase that is known to be the worse of two choices. A baseless post purchase rationalization.

>worse
no

the 480 is better than the 1060

lol unusable frame rates.

>>anons on Sup Forums say I should still get the 480 because it "ages" better
>what does this even mean?

There are accusations of nvidia gimping their cards after new ones are released. That's probably what they're referring to.

480 > 1060

Looks like a downgrade and using more power.

Okay then. I can allocate RAM to the iGPU, correct?

Source? What reviewer uses a tech demo as a benchmark?

You're pretty much right in everything you say. Have this relative performance chart to ward off assmad fanboys who'll try and post cherry picked benchmarks of irrelevant games.

so I should get a 1060? the asus pg248q (gsync) is $450 compared to the viewsonic xg2401 (freesync) at $250. Both are within my budget but the freesync one seems to be nicer. it's hard to find (((gsync))) monitors that aren't insanely priced

cmon man, are you trying to convince me or yourself? there are like 3 or 4 games where the 480 squeaks ahead, the 1060 is solidly in the lead for 99% of games

is there any proof to this? I don't really follow hardware news but I have heard of the 3.5gb debacle

I don't think they changed that between haswell and skylake.

>that aren't insanely priced
Added cost of a hardware solution versus a software solution.

nVidia was used, you can get $30 back because of it.

So, yes? I'm upgrading from a fucking Q6600. I just found out yesterday that some BIOSes have cursors.

i'm willing to pay more for better features, but all the 1080p gsync monitors seem to have it as their only selling point.

>So, yes?
I'd say yes but I cannot guarantee it since I'm working off knowledge of haswell.

Gsync/Freesync aren't a have to have feature. It's nice to have when gaming in the 30-60fps region but no great loss without it.

>is there any proof to this?
I don't think there is any evidence, but I wouldn't be surprised if it were true.

Okay. Thanks for the help.

Pretty normal for Nvidia. Burying their old generation cards is common: it happened to Kepler, it's happening to Maxwell, and it will happen to Pascal.

Is this meme still going around?

bytemedev.com/the-gtx-780-ti-sli-end-of-life-driver-performance-analysis/

>(You)

AMDrones are still perpetuating the myth that Pascal doesn't support async. Do you honestly expect them to stop perpetuating this myth?

Well it's true that Pascal doesn't have a dedicated async compute engine like AMD, but it has better support than Maxwell via preemption and stuff

If AMD had good compatibility with Japanese games, I'd stop buying Nvidia.

I have no choice.

YES, WHAT'S THE REASON?
I KEEP ASKING BUT NO ONE KNOWS

Fucking 3rd party 1060 = 3rd party rx 470
Shit is stupid.

There's no reason to buy AMD

>Well it's true that Pascal doesn't have a dedicated async compute engine like AMD
When did AMD cut out graphics from their compute units?

are all AMDrones this retarded?

>Previous Generation Products
>legacy
I don't see the problem.

Yes.

>RX480 is as fast as GTX1080
next week
>RX480 is as fast as GTX1070
next week
>RX480 is as fast as GTX1060
next week
>OC'd RX480 is as fast as GTX1060
next week
>OC'd RX480 with a water block is slightly faster than a stock GTX1060

techpowerup.com/225017/nvidia-geforce-gtx-1060-3gb-equipped-with-fewer-cuda-cores

>didn't even bother to rename it the 1050 because that would drive away customers

Legacy =/= EOL

Never? AMD ACEs are not dedicated async any more than Nvidia's cores are dedicated async. Glad that was cleared up.

>spending $1200 on a single card

Snob pls

It was my understanding that the engine was separate from the actual gpu cores. If I'm wrong pls correct me

>It was my understanding that the engine was separate from the actual gpu cores.
It's taking me a while and I cannot find the exact article that goes in-depth on the AMD ACE process. But I found an article from last year which touches on it while discussing how AMD and pre-Pascal Nvidia handled async compute.

>The Asynchronous Command Engines in AMD’s GPUs (between 2-8 depending on which card you own) are capable of executing new workloads at latencies as low as a single cycle. A high-end AMD card has eight ACEs and each ACE has eight queues.
>It’s been suggested that AMD’s approach is more like Hyper-Threading, which allows the GPU to work on disparate compute and graphics workloads simultaneously without a loss of performance
It isn't a dedicated async compute engine as the ACEs handle both compute and graphics.

The same article also touches on the myth that Nvidia cannot into async.
>There were claims originally, that Nvidia GPUs wouldn’t even be able to execute async compute shaders in an async fashion at all, this myth was quickly debunked. What became clear, however, is that Nvidia GPUs preferred a much lighter load than AMD cards. At small loads, Nvidia GPUs would run circles around AMD cards. At high load, well, quite the opposite, up to the point where Nvidia GPUs took such a long time to process the workload that they triggered safeguards in Windows. Which caused Windows to pull the trigger and kill the driver, assuming that it got stuck.
itandtechnology.wordpress.com/2015/09/09/asynchronous-compute-amd-nvidia-and-dx12-what-we-know-so-far/
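The "latencies as low as a single cycle" claim above can be illustrated with a toy scheduling model (pure illustration, not real GPU behavior; the cycle counts are made up for the example):

```python
# Toy model of when a newly submitted compute job can begin executing.

def compute_start_ace():
    # ACE-style hardware queues: new compute work can be dispatched on
    # the next cycle, regardless of how much graphics work is in flight.
    return 1  # cycles; mirrors the "single cycle" claim quoted above

def compute_start_coarse(graphics_cycles_remaining, switch_cost):
    # Coarse preemption: compute waits for the in-flight graphics batch
    # to drain, then pays a fixed context-switch penalty on top.
    return graphics_cycles_remaining + switch_cost

# With 10,000 cycles of graphics work queued and a 500-cycle switch cost,
# the difference in start latency is three orders of magnitude.
print(compute_start_ace())            # 1
print(compute_start_coarse(10_000, 500))  # 10500
```

This is only the dispatch-latency side of the story; as the quote above notes, which approach wins overall also depends on how heavy the mixed workload is.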

Just keeps getting better

>Full damage control
Blazing fast shilling, just like the house fires of nvidya-tards

reminder, gtx 1060 cant SLI

SLI and CF are already niche and likely to be dead in DX12 with explicit mGPU.

>AMDrones believe Nvidia cards don't get updates

Lol my GTX670 just received an update a month ago.

Enjoy your "top of the range" 4-year-old processors and graphics cards, AMD tards.

because it is ONE gen from current gen
if it were 3-4 gens old nobody would care, but 2 years old is not legacy

>no longer in production
>not legacy

See
And my gtx 680 is still getting drivers you dumb fuck

That's only AMD where legacy means unsupported :^)

Do these drivers even do anything? I mean my 770 is still getting "updates" but I didn't notice any improvement in XCOM 2 with game-ready drivers or without. I know this is a very small sample size. Has anybody tested whether there is actually a benefit from updating older cards?

from what we've seen in the past they're actually slowly gimping old cards