I have jumped ship from AMD to NVIDIA

I have jumped ship from AMD to NVIDIA.
No regrets.

Have i done you proud Sup Forums?

Other urls found in this thread:

anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/9
anandtech.com/show/10486/futuremark-releases-3dmark-time-spy-directx12-benchmark
nowinstock.net/computers/videocards/nvidia/gtx1070/

You did good. Fuck AMD and their sinking ship.

Yep. I did too. Welcome.

Nice. I recently did the exact same. Though you probably could've saved $22.98 by buying at a Best Buy or free shipping from Newegg.

Always do the opposite of what Sup Forums says.

You fell for it and bought a memecard. Congrats!

why the fuck did you spend $460 on a fucking 70 series card?
what fucking videogame requires you to get that over what you currently have?

>tfw never had to resort to AMD
Feels good.

Yes, good for you. I also joined team green and will never look back. #betterdeadthanred

yeah that card is a beast. i got one but i kinda didn't need it since i always end up playing rocket league

>that price

retard decision

Great decision. AMD will be dead by the end of the year. Nvidia is the future of PC gaming.

This. I shill the rx480 just to trick retards.

Welcome to the winning team my friend.

>mfw AMD CPU with NVIDIA GPU

the worst of both worlds

>1070
>memecard
>not the most valuable card this gen

Best of both user, I did the same.
Gtx 960+fx6300

This entire generation is a meme. Everything is sold out and people are happily spending 50-100 dollars over MSRP like good goyim.

Yes, good goyim. AMD doesn't belong on our good consumerist technology board :^).
Remember to upgrade next year when DX12 hits, GTX 1170 with Async Support* and a great 7.5gb RAM, only $550 MSRP.

>coupling a 1070 with a 1080p60 screen

Nice meme

Yea I bought an EVGA 980ti Classified when they first came out. $750. Boosts up to 1490mhz out of the box and can hit +150mhz to core and +850mhz to memory at +50mv. Pushing a single 32" 1440p panel at 60hz and am very happy with it. The new gtx 10xx series of cards seem HIDEOUS from an aesthetics perspective. I mean I have a non windows R5 case so I really don't care too much about looks but holy shit. Has anyone seen the 1080 classy? Fucking garbage.

I'll probably upgrade to the 1180 or 1080ti if they come out if only for the proper async. Until DX12 is mainstream I'm more than happy crushing any and all DX11 games on ultra 60fps 1440p.

I shill the rx480 because it's good and NVIDIA needs competition

Morally justified, but still faggotry.
The 480 is good but not great.

>rx 480
>good

TOP KEK

i hope it keeps you up at night knowing i spent $430 on a 1070 supporting mein jewish overlords.

I have the SC version, it's a beast

>Sup Forums is filled to the brim with rabid Nvidia fanboys
>OP thinks he's a special snowflake for buying Nvidia

??

I agree with you that Nvidia (and Intel for that matter) needs competition, but shilling for a company like a retard does no one any favors.

Playing devil's advocate to keep things in perspective? Helpful.
Spouting useless things like "nvidia cannot into async" when the facts say otherwise? Not helpful.

>Spouting useless things like "nvidia cannot into async" when the facts say otherwise? Not helpful.
But they cant
Nvidia already got sued for their 3.5gb scandal, they're not invincible you know

>1070
>$459.99

>Spouting useless things like "nvidia cannot into async" when the facts say otherwise? Not helpful.
Do you really think that Nvidia can do true async?

You know what they say about a fool and his money
here's your (you), use it well.

>nVidia is expensive

yea yea, maybe in the states. As far as I remember, AMD has always been more expensive in eastern Europe since 9600GT times. They sell only about 1 AMD GPU for every 10 NVDA GPUs sold here (biggest retailer's stats), and that's with them having the inferior product. Can't imagine the jewing they would do if their cards were actually better lol.

Nvidia literally took off the hardware required for Async and tried to force it through software so they can get that "muh PPW!!!".

>But they cant
Source?
I'll provide one that states otherwise, since I doubt you will (or can) provide a citation.

> In fact, next to the addition of GDDR5X, I’d consider the changes to work scheduling to be the other great change to the overall Pascal core architecture. With Pascal, NVIDIA has significantly improved their ability to allocate and balance workloads, which in turn has ramifications in several different scenarios. But for the AnandTech audience the greatest significance is going to be in what it means for work concurrency when using asynchronous compute.
anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/9

See

thanks for the (you)

i upgraded from a 970 and it was a great purchase.

It's the FTW version with RGB lighting

...

Your source seems to be a circular argument, i.e. not valid.
>If both AMD and nVidia are running the same code then Pascal would either gain a tiny bit or even lose performance.
His conclusion is based on how he EXPECTS Nvidia's async to function rather than any understanding or explanation of how it actually does function.

Also, just for the sake of completeness: he seems to think that parallelism is the only valid form of async compute, which is complete hogwash. Async involves both parallelism _and_ concurrency. There's a rather good discussion of this in another Anandtech article, most notably with verbose explanations in the comments.
anandtech.com/show/10486/futuremark-releases-3dmark-time-spy-directx12-benchmark
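The parallelism-vs-concurrency distinction is easy to demonstrate outside of GPUs entirely. A toy Python sketch (nothing GPU-specific is assumed; a sleeping thread stands in for a compute job):

```python
import concurrent.futures
import time

def job(name: str) -> str:
    # Stand-in for a GPU compute task; sleeping frees the worker the
    # way a stalled shader leaves execution units idle for other work.
    time.sleep(0.1)
    return name

# Submission is asynchronous either way: submit() returns a future
# immediately instead of blocking. With one worker the four jobs drain
# off a single queue one after another; with four workers they run in
# parallel. Both are "async" from the submitter's point of view.
timings = {}
for workers in (1, 4):
    start = time.perf_counter()
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(job, f"job{i}") for i in range(4)]
        results = sorted(f.result() for f in futures)
    timings[workers] = time.perf_counter() - start

print(results)                  # ['job0', 'job1', 'job2', 'job3']
print(timings[4] < timings[1])  # parallel run finishes sooner: True
```

Both configurations complete the same work; only the degree of hardware parallelism differs, which is the crux of the argument here.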

Finally, that does not support the claim that Nvidia cannot do async or that the async hardware was removed. At best it tries to pigeonhole async as being valid only so long as it is done AMD's way, which no developer, designer, or reviewer has claimed or agreed with. In other words, what you posted was a "no true Scotsman" fallacy.

In fact, it actually helps prove that Nvidia _can_ do async, just in a different and arguably less efficient manner, likely because fewer cores are left idle in the first place thanks to Pascal's highly efficient scheduler.

>he seems to think that parallelism is the only valid form of async
And he's right since it's objectively better.

Don't worry, senpai. When the 1100 series comes around to parallelism, you'll be on board.

>chart doesn't say which label is faster

ok jim

>And he's right since it's objectively better.
One method is better than the other and that makes the other method NOT async computing?

> Async compute itself is a catch-all term – there are lots of things you can do with asynchronous work submission/execution
You're committing the same "no true Scotsman" if you claim concurrency is not async but parallelism is.

>Don't worry, senpai. When the 1100 series comes around to parallelism, you'll be one board.
I don't waste money on a new GPU every year because I'm not a good little Goy.

Sorry.

>mudslime
>right
nope

There is no 1070 below 450USD right now. But at least that shit stays in stock.

1080 are impossible to buy unless you follow retailers on twitter and get email notifications.

Hint: Nvidia is faster

There's actually a bunch of them lower: nowinstock.net/computers/videocards/nvidia/gtx1070/

lol amd cpu's are trash pleb tier

>no proper async, Vulkan, DX12
>still has the faster cards (GTX1080)

dealwithit.wav

Unironically works better than AMD GPU with AMD CPU, because AMD still hasn't multithreaded their driver, even tho they had years to do it.

I think someone at AMD claimed it was impossible, even tho nvidia did it. I don't even know what the fuck is wrong with them.

I know this is pointless to ask but now that you've been provided proof that Nvidia does async, "true" async that just isn't the same approach as AMD uses, can you give up the misconception that "Nvidia can't do async?"

no

It only does it on Pascal, with its crappier method. Prior to that it didn't, and thus the "Nvidia can't do AC" meme was born. We don't say it cannot do AC anymore; we just say it uses a crappier implementation, and Nvidia has enough clout to 'encourage' devs to use their implementation. I am not saying Futuremark were paid off, since AMD signed off on it. But it would have been nice to see a path for both architectures offered instead of the one option that puts Pascal in a good (but not favorable) light.

Sup Forums rarely talks about GPUs and the technology behind them. Go there now and look through the catalogue. GPUs are technology and we discuss the technology behind them. Fuck off with this "back to Sup Forums" meme.

>It only does it on Pascal for it's crappier method
Debatable quality aside, are you conceding the point that Pascal does async?

>Prior to that it didn't
Maxwell was capable of async.
>From a technical perspective, NVIDIA has slowly evolved their work queue execution abilities over time. Consumer Kepler (GK10x) could only handle a single work queue, while Big Kepler (GK110/GK210) added HyperQ, which introduced a 32 queue setup, but one that could only be used with pure compute workloads. For HPC users this was a big deal, but for consumer use cases there was no support for mixing HyperQ compute queues with a graphics queue.
>Moving to Maxwell, Maxwell 1 was a repeat of Big Kepler, offering HyperQ without any way to mix it with graphics. It was only with Maxwell 2 that NVIDIA finally gained the ability to mix compute queues with graphics mode, allowing for the single graphics queue to be joined with up to 31 compute queues, for a total of 32 queues.
>This from a technical perspective is all that you need to offer a basic level of asynchronous compute support: expose multiple queues so that asynchronous jobs can be submitted.
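The queue model described in those quotes can be sketched abstractly: one graphics queue plus several compute queues, all drained independently. This is a toy illustration in plain Python threads (queue counts and job names are made up; this is not how a driver actually works), just to show why exposing multiple queues is the baseline for async job submission:

```python
import queue
import threading

NUM_COMPUTE_QUEUES = 3  # Maxwell 2 exposes up to 31; kept small here

def make_worker(q: queue.Queue, done: list, lock: threading.Lock):
    # Each worker drains one queue, standing in for hardware that
    # services a queue independently of the others.
    def run():
        while True:
            work = q.get()
            if work is None:        # sentinel: queue is drained
                return
            with lock:
                done.append(work)
    return run

graphics_q = queue.Queue()
compute_qs = [queue.Queue() for _ in range(NUM_COMPUTE_QUEUES)]
done, lock = [], threading.Lock()

# Jobs are submitted up front without waiting on one another --
# the essence of asynchronous work submission.
graphics_q.put("draw_frame")
for i, q in enumerate(compute_qs):
    q.put(f"compute_{i}")

threads = []
for q in [graphics_q] + compute_qs:
    t = threading.Thread(target=make_worker(q, done, lock))
    t.start()
    threads.append(t)
for q in [graphics_q] + compute_qs:
    q.put(None)
for t in threads:
    t.join()

print(sorted(done))  # ['compute_0', 'compute_1', 'compute_2', 'draw_frame']
```

Whether the queues are then serviced in parallel (AMD's ACEs) or time-sliced (Nvidia's scheduler) is a separate question from whether multiple queues exist at all.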

>We just say
"We?" Are you including others like >But [Nvidia] can't [into async].
And
>Nvidia literally took off the hardware required for Async
These guys don't say that Nvidia is using "a crappier method." They say Pascal cannot do it.

>we discuss the technology behind them
Wait, so the technology behind modern GPUs is just corporate shilling and buzzwords?
Amazing...

...

Enjoy shit image quality, low res. shadows and shit DX12/Vulkan performance

wow you sound a bit mad there.
calm down and get a job.
you'll move up some day.

>giving me shit for being an informed consumer and not wanting to get screwed by novidya

Intel CPU and AMD gpu masterrace familia

If Time Spy had been coded specifically to use AMD's optimized ACEs, favoring one vendor (AMD) over another (Nvidia), it would invalidate their credibility as a neutral benchmark provider. That would, in effect, be using code to enhance AMD's results.

Oh, and just to clean up technical jargon. Concurrency is parallelism. It is static parallelism.

R&D on nasa budget

You mean enjoy better performance than any AMD card available

>I don't even know what the fuck is wrong with them.

AMD is totally incompetent at this point. I mean they just released a video card that couldn't even meet the simple power limits of a motherboard. They are retards.

No. There should be two code paths, selected upon detecting which GPU is in use: if Nvidia, use concurrent; if AMD, use parallel. The days when you only had one benchmark path (DX11) are over. DX12 does not work like that anymore; it can use different paths depending on which you implement.
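Roughly what I mean, as a hypothetical sketch (the vendor names are real, but the function names and string results are made up for illustration; this is not any real benchmark or GPU API):

```python
# Hypothetical vendor-dispatch sketch: pick a tuned path per GPU vendor.
def run_parallel_path(workload: list) -> str:
    # Path tuned for hardware queues running graphics + compute in parallel
    return f"parallel:{len(workload)} jobs"

def run_concurrent_path(workload: list) -> str:
    # Path tuned for time-sliced concurrent scheduling on shared units
    return f"concurrent:{len(workload)} jobs"

def dispatch(vendor: str, workload: list) -> str:
    paths = {"amd": run_parallel_path, "nvidia": run_concurrent_path}
    return paths[vendor.lower()](workload)

print(dispatch("AMD", ["shadows", "lighting"]))     # parallel:2 jobs
print(dispatch("NVIDIA", ["shadows", "lighting"]))  # concurrent:2 jobs
```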

>There should be two code paths, selected upon detecting which GPU is in use: if Nvidia, use concurrent; if AMD, use parallel.
How are you going to write two code paths for the same thing? Concurrency is parallelism. What you are asking for is a different code path that is optimized for AMD rather than a path that is vendor neutral.

I have a 970 and perfectly happy with it. I will only upgrade when a card comes out at a reasonable price that can get 60 fps at 4K.

>What you are asking for is a different code path that is optimized for AMD

This is exactly what AMDtards want. They don't give a shit about objectivity; they just want to make AMD look good at any cost, including lying, cheating, and stealing.

You did good.

The BEST AMD fanboys can say is that MAYBE AMD's hardware async compute will matter some time in the future.

It cannot be vendor neutral since it favors light-load AC and does not allow the AMD architecture to use its full power. AMD can handle AC + graphics in parallel; Nvidia cannot. How can you test the full potential of each if one is not allowed to perform at its fullest? I am not saying it should be at the expense of gimping Nvidia's performance in any way. Both GPU architectures should be allowed to show off their full potential. A 'blanket' DX12 'benchmark' is worthless. Of course Nvidia fans are happy, because it skews the results to make their GPUs not look so bad. I don't care if AMD signed off on this. They are dumb shits for doing so.

Just like GameWorks? Skewing performance in Nvidia's favour, and Nvidia blackmailing reviewers into using GWG's to make themselves look best ever?
Face it, the benchmark SHOULD use HEAVY asynchronous compute+graphics.
If Nvidia are too lazy to implement that, it's their own fault. Nobody is stopping them from bringing true improvements/innovation.

You're just upset that GameWorks isn't going to be able to stop AMD in the future.

>It cannot be vendor neutral since it favors light load AC and does not allow the AMD architecture to use it's full power.
Would using its full power require low level optimizations that would specifically benefit one card over another?

>How can you test the full potential of each if one is not allowed to perform at it's fullest?
You don't. That's left to each game developer, working with either vendor to get the best out of each card. That's why we see Doom getting such good results on AMD cards: it takes full advantage of the shader optimizations AMD provides, but nothing at all of what Nvidia provides.

That leaves but one choice. A high level approach that highlights the basics both cards are capable of without any unfair optimizations.

>That leaves but one choice *if you want to have a reliable apples-to-apples comparison.
Fixed.

Been waiting for over a week constantly checking sites for the OC Nitro+ 8GB rx 480.

Feeling like I should just use the igpu from my 4790k to tide me over until Vega... probably September time. Or maybe I could wait until August 1. I was hoping to get the 480 by August 1 at least so I could have it ready by my birthday the next day.

If you can wait for Vega, it'll provide a few options:
1. Vega could be good, could answer Nvidia on the high end segment.
2. Vega could be meh. Though with the release of Vega, Polaris MAY go down in price slightly. Even if not, there would be stock again at least.
3. If Vega isn't any good, consider Nvidia if that's something you're willing to do.

I'm with you on this struggle. I feel like biting the bullet and buying a Fury for $350 right now. The GTX 1070 is too overpriced, and I cannot get a GTX 1060 for a reasonable price either.

>If Vega isn't any good, consider Nvidia if that's something you're willing to do.
DX12 hype bit me too hard to consider that. It's either get the 480 or igpu -> 490 for me.

Fair enough. I don't blame you, especially considering I've got a Fury X on a custom loop, so I don't plan to change until Navi (or maybe go CrossFire).
In that case, unless you really can't bear the iGPU, you'd definitely get more value out of waiting, because it's always worth paying out for a higher grade card if the price increase isn't drastic.

You're just spewing more bullshit. Time Spy was literally created with AMD's input. It's a balanced benchmark but you're mad because it shows AMD cards are just plain bad, even in DX12.

Nvidia cards are literally better than AMD at DX12. See Time Spy, the new DX12 benchmark, where it clearly shows this.

I've seen those charts and benchmarks and I'm aware of the 1060/1070 performance. The other thing nagging at me with this choice is the async thing shown earlier, and this feeling that AMD may have more advanced stuff for the not-near future. I may get the 480 soon and then use that until some 590, for example. The performance difference between the 480 and 1060 is negligible to me considering I've been playing on an 8850M from my laptop for the past two years, and nothing better than the R9 280X. The 1060's 6GB is also too little for me.

People regretting getting the 970 because it actually has 3.5GB instead of the full 4GB also affects my perception of Nvidia cards in the negative.

>i want more than 6gb of vram
For what? If you are playing at higher resolutions than 1080p the 480 is garbage anyway. It is just classic amd padding out their hardware stats with things they cannot make proper use of.

>this feeling that AMD may have more advanced stuff for the not-near future

That's complete bullshit. AMD sadly doesn't even have the money to create anything advanced anymore. You can see this how the newest card can't even compete with Nvidia's midrange. If you want long term support, definitely go with Nvidia because their cards will be far better supported than anything by AMD.

but they forgot about my kepler card..... :c

Maybe video rendering and encoding?

...ok. Which of the new nvidia cards near $260 should I get that are sleek like the Nitro+?

Maybe I'll just get a 590 whenever it comes out or whatever the Nvidia counterpart would be.

>$459

you should have waited, it's still way too overpriced for the card.

>a card that costs 27% more is 12% faster

Wew lad

Yeah, and doesn't even burn your motherboard or hit 90c on load

That's a whole fuckin year to wait.

Is there anything in the next couple of months at least?

I'm in a dilemma too. I want a FreeSync monitor, which pretty much rules out Nvidia. But I don't want a shit monitor either. I would prefer 1440p, and I keep reading about shitty Acer IPS panels, the not-so-hot limited FreeSync range on the Asus, and IPS QC problems. Then I look at BenQ and it's a TN panel. But AMD has nothing new to drive it, and it would be stupid to buy a Fury X now.

Are you the kind of idiot who buys stock or "Founders editions" ?

>$460
>to play computer games

Nah, but lots of people are considering the amount of shilling this board experiences on a daily basis.

did exact opposite - no regrets
imagine that!

Considering ur mum is for free, I can see your point.

Good on ya lad.

That really depends on your environment. If you live in a place with good aircon, or where the temps in your room rarely hit 28c in the summer, then it should not be a real issue. The Powercolor barely breaks 69c under load in a 21c room; add another 8c and it's 76c. Big deal. If you are buying a ref 480 you are gonna water cool it anyhow. Aren't you? Well, you should be. Heat is only an issue if it's hitting over 100c on a regular basis. The real consideration should be fan noise, and power usage if you are poor. Again, the Powercolor is pretty damned quiet under load, and in quiet mode is within 1% of the 1060.

> $0.20 has been deposited into your account.