I wonder if that shill posting walls of pasta and AMD marketing slides has killed himself.

>hard'o'cp.com
Enjoy your v&.

Oh well. For the money I paid (£380 for an RX Vega 56 flashed to 64 vs £450+ for a GTX 1080) I think it performs well enough. Plus I avoided the Goysync tax.

THANK YOU BASED NVIDIA

Fucking scammers, they also lied about latency.

MAGIC DRIVERS ARE COMING JUST NEED MORE ADRENALINE

India strong

If they still cost that much, then yeah, Vega 56 would be a great value, as long as power consumption isn't a factor.

But they cost almost double that due to miners, whereas 1080s are comparatively cheap and actually good performance/$.

SHUT THE FUCK UP WAIT FOR SPECIAL FEATURES

>Flashing the BIOS to fake a 64
The pinnacle of "w-well I mean It's worth the money"
Flashing to a 64 BIOS has been proven to provide only 4-7 fps gains, and you have to undervolt, which I'm sure you already did. Bet you have the benchmarks ready to go at a moment's notice in case anyone disrespects m'vega.

Curious: did you ever care about power consumption in the past? Say, the Nvidia 200 series vs the ATI 4000 series? The 400 series vs the 5000 series? The 500 series vs the 6000 series? I ask because during those time frames AMD had superior performance per watt. Yeah, they didn't have the single top-dog card, that one tier, but everything was on par in performance while using less power.

It wasn't until Nvidia came out with Kepler that I started noticing people giving a shit about power consumption. I remember when the 7970 GHz Edition became the top-dog graphics card: no one gave a shit, and I saw a lot of people go on and on about how "the 680 might be slower, but it uses less power! and i can always overclock! (which will use more power but who cares!!)". Which I find ironic, because the 5870 vs the 480 was like a 10% difference and the 5870 used less power... yet no one ever gave the 5870 the treatment they gave the 680. Even the drivers back then were barely any different quality- and stability-wise; during the 200 vs 4000 series era especially, one could easily argue AMD had more stable drivers than Nvidia.

Everyone cared about power consumption.
Everyone made fun of Intel for their power consumption, and everyone made fun of Nvidia when they fucked up their power consumption. Except now that AMD has been fucking up their power consumption for like 80 years straight, faggots like you type "well to be fair you have to have a fairly high IQ to understand AMD's power consumption, m-muh double standard I swear". And one could not easily argue AMD had more stable drivers back then; they've notoriously been shit. Sometimes they weren't, though, for like a week.

Just grow up

>Everyone cared about power consumption
Then explain why Nvidia sold more 260s than AMD sold 4870s and 4850s, when the 4870 performed on par and used less power.
Explain why Nvidia sold more 470s when the 5850 was on par and used less power.
Explain why Nvidia sold more 480s and no one gave a shit about the 5870, when the 5870 might have been 10% slower but used much less power and didn't get nicknamed the flamethrower like the 480. The 6950 was faster and used less power than the 560, and was about as fast as a 560 Ti while still using less power... yet Nvidia sold more 560s and 560 Tis than AMD sold 6950s and 6970s combined. I can go on and on, because if people gave a shit about power, nobody bought or acted like it. People didn't seem to ACTUALLY care until Nvidia cared. People may have bitched, but man did they not act on it.
And all during that time frame AMD kept slowly losing market share.
>AMD has been fucking up their power consumption for like 80 years straight
I'm starting to think you're a retard.
> and one could not easily argue AMD had more stable drivers back then, they've notoriously been shit.
Yup, you are the retard. Microsoft, back during the 9000 - 200 series days, came out stating that ~30% of BSODs were caused by Nvidia drivers and 6% by AMD/ATI. Most TDRs were caused by Nvidia drivers; hell, Nvidia still has their PSA about TDRs on their support forum page, dated back to that time frame. AMD drivers didn't start going downhill until the start of the HD 7000 series, when they went months on end without releasing new drivers, but that wasn't the case for the longest time prior to first-generation GCN.

...

>AMD drivers didn't start going downhill until the start of the HD 7000 series, when they went months on end without releasing new drivers, but that wasn't the case for the longest time prior to first-generation GCN.
And these past two years AMD has really stepped up to the plate with driver releases; they've been putting out something like a new driver every month. Looking at complaints on both the AMD and Nvidia driver forums, it really seems like the two are equal now in terms of stability and features, though Nvidia does appear to be having issues with Z370 audio solutions due to high DPC latency in their drivers:
forum.gigabyte.us/thread/2549/aorus-z370-sounds-problem-realtek?page=11
rog.asus.com/forum/showthread.php?97220-Maximus-X-Audio-Popping
forums.geforce.com/default/topic/1026652/geforce-drivers/387-98-causes-dpc-sound-crackling-and-poping-updated-still-happend-in-388-00/6/
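
The crackling and popping in those threads is textbook buffer underrun behavior: a DPC latency spike delays the audio stack long enough for the output buffer to run dry. If you want to check your own machine, here's a minimal sketch that counts underruns; it assumes the third-party Python sounddevice package, which is my addition and not something from those threads:

[code]
import sounddevice as sd  # assumption: pip install sounddevice

underruns = 0

def callback(outdata, frames, time, status):
    global underruns
    if status.output_underflow:  # buffer wasn't refilled in time -> audible pop
        underruns += 1
    outdata.fill(0)  # play silence; we only care about the timing behavior

# a small block size makes the stream sensitive to latency spikes
with sd.OutputStream(callback=callback, blocksize=128, samplerate=48000):
    sd.sleep(10_000)  # listen for 10 seconds

print("underruns in 10 s:", underruns)
[/code]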

Everything you said is completely false, and there is absolutely no point in even trying to retort the ridiculous nonsense you spewed. It's blatantly obvious from the way you type that you're either underage or an RTG Indian trying to make some overtime for Christmas presents for your 12 kids.
I'm sure they'd love it if you gave them the education you've clearly missed.

Why are you responding to yourself with cherry-picked articles?

>AMD has been fucking up their power consumption for like 80 years straight
Hello, 1999 - 2006? Intel's housefire Pentium 4s vs the Athlon/Athlon XP/Athlon 64, all of which were as fast or faster while using less power and producing less heat? AMD didn't really have power consumption issues with their processors until Bulldozer.

On the GPU side, power consumption wasn't really an issue until the HD 7000 series, and even then the gap between it and Kepler's 600 series wasn't that big. It wasn't until the 300 series that AMD REALLY fucked up, because they refreshed Hawaii as the 390/390X to compete with the 970 and 980 instead of using a cut-down version of Fiji... which sadly was a year late. Fiji's power consumption against Maxwell wasn't nearly as bad as people made it out to be: the Fury X drew about ~35 watts more on average than the 980 Ti, and the Fury about the same amount more than the 980. At least the Fury was faster than the 980; sadly, the Fury X was only as fast as or slower than the 980 Ti.

Yeah, the HD 2000 series from ATI was 100% designed and built before AMD bought ATI, but to give them credit, the 2000 series was a completely new architecture over their prior 9000 series, and it wasn't until the 3000 series that they got it figured out. The 3000 series used less power than Nvidia's 8000 series and competed well with it, just like the 4000 and 5000 series did against Nvidia's 200 and 400 series. GCN, sadly, will always use more power; that's by design, since it's trying to be a hybrid between compute and graphics. Allowing that parallelism requires a lot of transistors and more hardware on the card, and all of that = more power.

Just stop samefagging

>post evidence of nvidia having driver issues
>hurr dur cherry picking nvidia does nothing wrong!!!
>i has no sissies wiht da nvidia so no one else daeos too!!!!xDDDDDDDDD

>inb4 was posting on phone or inspect elements the (You)s away

...

It's hilarious how AMD pajeets switch from "AMD is gonna destroy X" to playing the victim card as soon as their product is released and is literally shit.

Yes, I somewhat do.

I have a 7970. And while it's such an amazing card that I still get 120+ fps in tons of new games, I don't really want another 300+ watt monster unless it does 4K at 90+ fps with solid 0.1% minimums in pretty much every game.
But no card does 4K at 90+ fps in all games at any wattage, so no card really justifies 300 watts for me.

But yeah I guess for 99% of games, Vega probably stays under 100 watts where my 7970 would pull closer to 300.

And don't forget, when the 7970 came out people were ragging on Fermi (justifiably).
7970 had a power consumption that justified its performance. Fermi and Vega don't really.

My 7970 is still so good that I don't really need to upgrade. So I guess I'm waiting for Navi or some shit.

You tried

I want the r/amd rant when Nvidia releases its next GPU family! Wait for Navi...

This nigga is really asking us to post the 175-page list of every time AMD fucked up their drivers and never listened to the complaints.

Remember that time their driver update literally fried cards by locking the fans, or the time they released a graphics card lineup with no drivers (right now)?

you a silly billy

>remember that time their driver update literally fried cards by locking the fans
That was Nvidia

AMD had a similar issue happen in 2016, but it wasn't as bad and didn't kill any cards.

Are Nvidia drivers really that bad?
I had a 780 Ti and now a 1080, and the only early issues I had were resolved within a few days by a new driver.

I had a 5970, and that pile of shit never worked right.
And my 290X died 2 weeks after I got it.

Yes it was and yes it did
gearnuke.com/amds-new-crimson-drivers-bugged-killing-graphics-cards-report/
It killed a bunch; you can find complaints on the forums that warranties weren't honored, too.

Show me an article about this, beyond some random newly made GeForce forum account, and I'll believe you.

Use FRTC (Frame Rate Target Control) to cut down power consumption.
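
For anyone wondering why a frame cap saves power: the GPU idles between frames instead of rendering flat out, and idle time means lower power states. A toy sketch of the principle in Python (this is just the concept, not AMD's actual driver logic; all numbers are made up):

[code]
import time

TARGET_FPS = 60                  # hypothetical FRTC target
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allotted per frame

def render_frame():
    # stand-in for real GPU work; pretend a frame takes ~6 ms
    time.sleep(0.006)

while True:
    start = time.monotonic()
    render_frame()
    elapsed = time.monotonic() - start
    # uncapped, the next frame would start immediately (100% load);
    # capped, we sleep off the leftover budget and the GPU can drop
    # into a lower power state for that idle time
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
[/code]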

It killed a lot of cards, at least 10-12 that we know of.

There was a whole thread about it around a month after it happened last year. I used to have the compilation picture, but I'm out of the country right now, away from my computer.

>fried cards by locking the fans
wouldn't some kind of thermal protection kick in and shut off the card?
inb4 only overclocker fags who turned off all safety features were affected

Who the fuck cares. I get 2000 H/s in CryptoNight. It already paid for itself in the 2 months I've had it.
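
Rough payback math, for the curious. The 2000 H/s is from the post above; every other number below is a placeholder I made up, and CryptoNight payouts swing constantly, so treat this as arithmetic, not advice:

[code]
# back-of-the-envelope mining payback; only the hash rate comes from the post
hashrate_hs     = 2000.0  # CryptoNight hash rate, from the post
usd_per_khs_day = 4.00    # placeholder pool payout per kH/s per day
card_power_w    = 200.0   # placeholder wall draw while mining
usd_per_kwh     = 0.12    # placeholder electricity price
card_cost_usd   = 500.0   # placeholder purchase price

revenue_per_day = (hashrate_hs / 1000.0) * usd_per_khs_day
power_cost_per_day = (card_power_w / 1000.0) * 24 * usd_per_kwh
net_per_day = revenue_per_day - power_cost_per_day

print(f"net per day: ${net_per_day:.2f}")                  # $7.42 with these inputs
print(f"payback: {card_cost_usd / net_per_day:.0f} days")  # ~67 days, roughly 2 months
[/code]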

Nope.
By the time the cards shut off and the computers rebooted, they were dead. They flew up to 100+°C within a few seconds; that's hard to recover from.

The cards don't usually shut off until they get within 2-3 degrees of 110°C, but if you hit that really fast because your fans are locked, it can kill a card easily.
Here's a video of someone who got lucky and kept his card running a little longer:
youtube.com/watch?v=OM5HQmx1w8U

If the temperature could spike fast enough to kill the card instantly with a heatsink attached, wouldn't cards randomly die all the time because of that anyway? The heat has to transfer to the heatsink in the first place before it can be forcibly transferred to air. Fans don't just magically zap heat away from the die.

No.
Rapid jumps from 30 to 60°C, 40 to 70°C, or 30 to 40°C are no issue.
It's the rapid, massive jumps from 30°C to over 105°C that kill cards.

How can a GPU with a heatsink attached (regardless of fans) possibly put out enough heat to reach dangerous levels faster than it can throttle itself? It would have to happen so quickly that fans wouldn't matter at all.

The video isn't "luck"; it's entirely by design that when he blocks the fans, the card throttles to keep from heating up any more.

AFAIK that's only Fury and later.
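
The disagreement above mostly comes down to sampling: thermal protection is a polling loop, not an instant reflex. Here's a toy model of the claimed failure mode; every number is hypothetical rather than any vendor's firmware, and whether a real die under a heatsink can actually heat this fast is exactly what's being argued:

[code]
POLL_INTERVAL_MS = 100    # hypothetical: how often the driver reads the sensor
THROTTLE_AT_C    = 95.0   # hypothetical throttle threshold
SHUTDOWN_AT_C    = 107.0  # hypothetical hard cutoff ("2-3 degrees from 110")

def first_reading_past_threshold(heat_rate_c_per_ms):
    """Return the first temperature the poll loop actually observes
    at or above the throttle threshold."""
    temp = 40.0
    while temp < THROTTLE_AT_C:
        temp += heat_rate_c_per_ms * POLL_INTERVAL_MS  # heating between polls
    return temp

# fans working: slow rise, the loop catches it right at the threshold
print(first_reading_past_threshold(0.01))  # 95.0 -> throttles in time
# fans locked under full load: the rise overshoots the hard cutoff
# between two polls, the "dead before it can react" scenario
print(first_reading_past_threshold(0.8))   # 120.0 -> well past SHUTDOWN_AT_C
[/code]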

Just remember:

>“People (with AMD) are suffering. Everywhere you look, there’s a blurry spot in the center.”

blogs.nvidia.com/blog/2017/12/18/how-deep-learning-detects-eye-disease/

Where are my power-efficient budget Vegas?

available right now from nvidia

sad loser

no, he mostly posts on reddit now though

I specifically didn't get a 290X at launch because of its astronomical power usage, and that's coming from someone running 770s in SLI.

Just wait™

AMD is like a fine wine™

>Curious: did you ever care about power consumption in the past?
Do you not remember all the Fermi housefire memes?

It was the same thing a few years ago with those other drivers, whose name I don't remember anymore. They were supposed to magically increase overall performance by who knows how much, but they ended up doing nothing. The overall performance gain was 0.x%, with higher gains in a few specific games, as usual with most driver releases.

Despite this, AMD advocates kept saying the drivers improved performance by a lot, even though every benchmark proved otherwise. Any benchmark conducted before the drivers were released was now invalid in their minds, giving them another excuse, on top of "that site is paid off/shit/whatever", to explain away bad performance in specific games. This made any sort of argument impossible.

So even if you link articles like that showing no increase in performance, they won't care. They'll shut their eyes and ears and pretend that the drivers did in fact improve performance.

>Flashing to a 64 BIOS has been proven to provide only 4-7 fps gains

Are you implying that's a bad thing, my dear retard? Why would you not take 30 seconds to flash a BIOS if it offers even a small performance uplift for free? And that's putting aside the fact that 7 fps can be a 20-30% improvement at 4K.
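
To spell the arithmetic out: a fixed fps gain is a bigger percentage at a low baseline, which is why 4-7 fps matters more at 4K than at 1080p (the baseline figures here are hypothetical):

[code]
def uplift_pct(base_fps, gain_fps):
    return 100.0 * gain_fps / base_fps

print(uplift_pct(30, 7))   # ~23% on a hypothetical 30 fps 4K baseline
print(uplift_pct(100, 7))  # only 7% on a 100 fps 1080p baseline
[/code]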

>If the temperature could spike fast enough to kill the card instantly with a heatsink attached, wouldn't cards randomly die all the time because of that anyway
You know that's exactly what happens to Pascal cards, especially EVGA's, which have shitty VRMs.

Bull fucking shit. Go ahead and physically block your fan while running a game; the GPU locks up without any damage.

The BIOS only OCs the memory; the number of cores is the same as on the V56.
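
For what it's worth, the memory clock change alone is measurable. Bandwidth on a 2048-bit HBM2 card scales directly with the memory clock; the 800 and 945 MHz figures below are the commonly cited stock Vega 56 and Vega 64 memory clocks, so double-check them for your card:

[code]
# HBM2 bandwidth = (bus width in bits / 8 bytes) * 2 transfers/clock * clock
BUS_BITS = 2048  # Vega's HBM2 bus width

def bandwidth_gbs(mem_clock_mhz):
    return BUS_BITS / 8 * 2 * mem_clock_mhz * 1e6 / 1e9

print(bandwidth_gbs(800))  # ~410 GB/s at the stock 56 memory clock
print(bandwidth_gbs(945))  # ~484 GB/s with the 64 BIOS memory clock
[/code]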