SO IT BEGINS

>custom card
>can't overclock for shit
NVIDIOTS ON SUICIDE WATCH!

videocardz.com/60631/asus-rog-strix-geforce-gtx-1080-offers-poor-overclocking

This gtx 1080 meme came faster than op does when he ejaculates prematurely

still better than anything from MAD

I'd give a shit if it wasn't already the fastest card in the world

sick burn m8

Why do you need to overclock a video card that outperforms every other video card in existence?

Because people were able to overclock just about any GPU that outperformed every other GPU in existence before.

Because it still can't run crysis

And now cards come pre overclocked.

This is an improvement. Instead of wasting time overclocking it yourself, they do it for you.

>wait for custom board they said
>cant OC better than Fanboy Edition
>b-but it is already the fastest card
you nvidiots are a joke

Shilly willy go

>buying nvidia
Enjoy your gimped graphics i suppose.

How 2 get subzero tempz on the jew 1080... BE A GOD GOY AND BUY FOUNDER EDITION WC LOOP

I'm not a gamer idiot. I buy nvidia because of the sweet cuda cores.

MAD can't compete anywhere. fails at encoding, no V-ray tracing, no CUDA, fails at gaming, fails at CPUs.

literal shit company.

But it's already factory overclocked. Would you rather have a factory clock of 500MHz so you can boast about a 1500MHz overclock instead?

...

I tried overclocking my 7970 a few months ago, first with MSI Afterburner and then AMD's Crimson, and it completely rekt my graphics card. It now only works if i don't install drivers; if i do install drivers the card crashes as soon as i hit the desktop.

I'm too scared to overclock again.

you guys are fucking retarded.
the custom card has 6+8pin and 10 power phases.
It's supposed to OC better than the FE, but it performed worse. That's the point.

>Starts talking about MAD
>nobody is talking about MAD, thread has nothing to do with it
Are you sure you're not MAD yourself? Kek'd

This is as much an argument against nVidia as Gameworks is against AMD.

Meaning, nothing at all.

Why is it called the 1080?

Because your mom sucks that many dicks in a single day.

college wasn't free

Welcome to silicon lottery, your first time? Or just enjoy shitposting?

Kek

This post is PURE damage control, holy shit. I'm really starting to believe that there are paid shills acting on forums and imageboards.

>inb4 muh MADrone

I'm a nVidiot myself and I'm able to see this, if you're not, you're just retarded.

you is muckin about.

my 7970 is much easier to oc than my 4870.

It's nothing to do with the silicon lottery, you dumb fuck. All 1080s are voltage locked, presumably because they're Fermi-grade housefires and would literally explode if you put more voltage through them. This kills the overclocking.

>overclocking
>2016
>ishygddt

>Small update, PCOnline tested their STRIX card as well. They managed to achieve 2088 MHz (but only in 3DMark, so it's definitely lower after a 20-minute burn-in test). They said it crashes at 2.1 GHz though.
>diy.pconline.com.cn/795/7950003_4.html
kek

>factory overclocked card
>can't overclock much

oh really? Color me surprised.

This

>nvidia's advertised 2.1ghz is an overclock
kill yourself. Nvidia lied about literally everything they said during the release.

>this damage control

It has nothing to do with how much it can overclock beyond the factory overclock, you dumb cucks. Nvidia claimed that the reference card could hit 2117MHz on air at 67 degrees. Yet none of the fancy custom cards can get anywhere near that so far. The fact that they're factory overclocked is completely irrelevant. Why can't they even reach the speed that Nvidia showed the reference card allegedly doing? Because Nvidia lied, of course.

You sound like a faggot.
I can max the voltage slider on my 280X and it doesn't give a shit outside of hitting mid to high 80s

>all cards are locked to 1.25V maximum
>when the voltage gets close to this value, the card becomes unstable.
Sounds like the average silicon lottery to me. Stability issues at high voltages and clocks. Do you really think unlocking the voltage would yield any benefit other than an extra heater for your room, on a card which performs poorly even within the voltage limitations it has? I'm waiting for more reviews of the cards from multiple sources. That 2050MHz clock isn't even too bad considering the base clock of the FE. I've never seen clock rates that high myself, even if they fall short of what we saw in the conference.

>Nvidia claimed that the reference card could hit 2117MHz on air at 67 degrees.
They indeed showed a card which did that. Temps are something you should take with a massive grain of salt, as the test conditions have so many variables which affect this.

>Yet even none of the fancy custom cards can get anywhere near that so far.
How many have you seen measured so far?

They wouldn't have lied.

They would simply have cherry picked the single best card out of the thousands they had made and then locked the fan on 100% in a cold-ass room

you are equating Gimpworks with DX12

The issue is not with the temperature user.

The initial review suggested that the card was gimped by its single power connector, and that with extra power it could achieve the showcased overclock.

But that turned out not to be the case here.
Even with a custom PCB and mods, the card simply can't go over 2100MHz before things get unstable.
Not even the extra wattage could help, as whenever you try to go over the 1.25V limit the card simply shits itself.
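The way reviewers find that wall is basically a step search: keep bumping the clock until the card falls over. A rough sketch, purely illustrative — `is_stable` is a made-up stand-in for a real burn-in test, the step roughly matches Pascal's ~13 MHz clock bins, and the numbers are just the ones floating around this thread:

```python
# Hypothetical sketch: is_stable() stands in for a real burn-in test
# (e.g. 20 minutes of load in 3DMark, like PCOnline ran).
def max_stable_clock(is_stable, lo=1733, hi=2200, step=13):
    """Step the clock up in `step` MHz increments until the card crashes."""
    clock = lo
    while clock + step <= hi and is_stable(clock + step):
        clock += step
    return clock

# Toy model of the STRIX result in the thread: anything past ~2088 MHz crashes.
print(max_stable_clock(lambda mhz: mhz <= 2088))  # → 2084
```

Whatever the start point, you end up pinned just under the same ~2.1GHz wall, which is the whole complaint.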

Fucking sick burn brah

That's exactly what the silicon lottery is.

My old 2500K could hit 5.2GHz at 1.48v; other, shittier 2500Ks could barely maintain 4.5GHz at the same voltage.

Nvidia would have looked over the thousands of chips they had, found the one that was able to hit the highest clock (2200MHz) at 1.25v, and then used that one as the 'norm'. The low temps would be because it was running at normal voltage, and that card probably had better thermal paste, along with possibly even the fan being fed more than 12v.

None of that matters, you cant say anything at this point. This one card PROVES that pascal is a failure.

You are trying way too hard, downright comical if serious.

...

>then used that one as the 'norm'

Why does nvidia always do shit that ends up backfiring?

I think they do it so they can charge ridiculous prices and once the skeletons start coming out they can just lower the price a bit and still make it look like a good deal.

These retards aren't switching on the blue LEDs to keep it cooler while overclocking.

I get it, Nvidia lied. Sure they lied, but seriously, what does AMD have to offer?
I have the money to buy the card without sacrificing anything, if I buy it I just get the card and I get to play my video games because what the fuck else do I do with it?
So I can buy the card right, I can also buy anything AMD has to offer - wait a second, do they even have anything to offer? No right? Polaris? The one targeted for "budget" shit? I'm not on a tight budget, and this card outperforms anything so what's AMD going to do to stop me from buying this card?

>Why does nvidia always do shit that ends up backfiring?
they've got a larger fanbase and loyal die-hard fans. It will really not backfire, since those fans will just go into "IT'S JUST THE SILICON LOTTERY, NOT THE CARD'S FAULT" mode.

>I-I d-d-d-dont need good IQ
>m-m-muh FPS

good goy

I still wonder
>nvidia like >70% market share
>amd like nothing, probably similar with intel and their own shitty apu's
>on Sup Forums the AMD shill force outnumbers every other party by 100:1
how and why

if you have a 390/980 or better you don't need to upgrade, w8 for the 80ti or vega senpai

>shill
because you have no clue what that word means

>but not only game
>and got game
>by FaithLV for /r/pcmasterrace
hey guys buy AMD cards I don't care but can you please not spam reddit shit bait here? what the hell is wrong with you
kill yourselves

right
the people here hating on nvidia and recommending AMD cards everywhere I look are not shills

because i've noticed every amd owner i know never actually plays games and always likes to meme amd shit on r/amd and some on Sup Forums. every nvidia owner i know is some normie that doesn't even know amd still makes gpu's, and all they use their computer for is torrenting games and playing them.

tl;dr:

>amd owners = neckbeard loners that never play games and like throwing benchmarks everywhere

>nvidia owners = normal fags who literally only use their computer for games

Sup Forums is all about being hipster as fuck.

>amd = compute shit
>nvidia = gimped as fuck can only into games

i wonder why senpai

Keep up the good fight goys, your company appreciates it.

>dank meme the post

don't worry Nvidia owners only care about meme image quality bullshit like pcss and hbao+ and other meme features. actual game graphics are of little importance.

>amd
>any computational tasks

KEK, my friend's r9 390 doesn't even work with autodesk maya because of shitty drivers. he has to roll back to some 2011 drivers to get one of the tools working, which inevitably makes current game performance shit.

on the other hand i use nvidia CUDA in premiere pro, so if anything nvidia is the compute stuff and the gaming stuff, and amd are just there because they are. literally doing nothing for the industry.

that's weird you sure sound like a gamer idiot

you had us all fooled big time

That word doesn't mean what you think it means.

Furthermore, Nvidia's large marketshare is largely down to their marketing and appeal to the masses. They do this via mass marketing with websites, blogs, reviewers, etc., giving them special conditions in exchange for good reviews and playing nice with them. By doing this to a much larger extent than AMD ever has, they managed to sway the public's perception of both brands. And while AMD and Nvidia are actually rather close when it comes to performance in most cases, and similarly priced in most cases, the marketshare of Nvidia vs AMD is a landslide. It gets even worse when you realize that most innovation in the field is done by AMD (DX12/Vulkan, HBM, the list goes on), while everything Nvidia does is locked down behind a black box for them to use and no one else (GSync, GameWorks, etc). All of this while Nvidia has had far more problems with their hardware than AMD ever did, yet AMD gets much more flak whenever anything of the sort happens to them, while Nvidia's problems are covered up and accepted as if they were nothing.

Regardless, being a fanboy is retarded. You should always get the best product you can for your dollar. And in most cases, except at the very top of the high end, AMD is likely the best choice, barring some unique scenarios like mATX and the like.

Pretty much sums up why most people here prefer AMD over Nvidia. It's not that their AMD shills, they just dislike Nvidia shady shit.

>their AMD shills
they're*

Country and goals?

>shilling this hard
>for free
CLUCK

>most innovation on the field is done by AMD
>le amd had a hand in some gayming tech which means they've innovated the most
>amd gayming tech = lead innovators in all forms of the tech industry

kill yourself

Serbia. Remove kebab.

Sweden, one united north under the king

You didn't provide any counterexamples now, did you?

I'm not saying Nvidia doesn't innovate either. The difference is, like I said, Nvidia keeps it to themselves and puts a black box over it, so that no one else can look at it. On the other hand, AMD shares and makes everything open source, which actively contributes to progress. DX12 being based largely on Mantle is a great example of this.

What's Nvidia ever done that helped other technologies (not made by Nvidia) grow?

>I don't have an argument
>Better call user a shill
Remove yourself from this board.

Why is it impossible for you guys to have a discussion without taking sides on a fucking computer hardware brand or jumping into conclusions early?

>Why does nvidia always do shit that ends up backfiring?
Is it really just Nvidia? You really think this is going to backfire now, when the card performs better than any other GPU on the market while drawing moderate amounts of power, having normal temps and no competitor? There are people who are obviously going to care about the possible OC limitations, and people gullible enough to take Nvidia's marketing and hype at face value, who are probably disappointed every single time they release something. Whether the card is a good deal depends on how it performs relative to the competition. Currently there is none, and the card makes Nvidia's own previous flagship basically obsolete. If they were proper jews they'd sell this for a lot more until competition forces them to do otherwise.

If we are discussing the OC headroom of the card in the article, it's clearly just the silicon lottery. Whether the card in the article or the one in the Nvidia conference is closer to the average unit is impossible to say right now. We need plenty more tests.

Happened exactly like I said. No reason to upgrade from 980ti. In fact just never buy non-ti cards.

>Nvidia keeps it to themselves and puts a black box over it

so everything nvidia has ever produced has a black box over it? care to show those sources which show nvidia's workstation cards all use black box software

it's a simple question with a simple answer. any reply without a source about nvidia and their workstation innovations using "black box" software is damage control, and all your posts will be laughed at and discarded like the bullshit they are.

It's even worse when you consider that the 1080 only really beats the 980ti by like 15-20% when the 980ti is not OC'd. It wouldn't have been convenient for Nvidia to match the 1080 against an OC'd 980ti, because they'd be too close for comfort for how good the 1080 is supposed to be. But that didn't matter, since it was a stock 1080, and we had yet to see an OC'd 1080, which would happen with the aftermarket coolers, of course.

Now that the news is out that the 1080 doesn't OC much even with a custom PCB and aftermarket cooler, that OC'd 980ti is looking preeeetty close to the 1080. There's really no reason to pay almost double for a 1080 when the 980ti is pretty much the same. Unless, of course, Nvidia starts conveniently forgetting to push performance updates through drivers for the 980ti... :^)
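Taking the thread's own rough numbers at face value — a stock 1080 ~17% ahead of a stock 980 Ti, and ~20% OC headroom on the 980 Ti, both assumptions rather than benchmarks — the arithmetic works out like this:

```python
# Illustrative figures from the posts above, not measured benchmarks.
stock_980ti = 100.0                # baseline performance index
stock_1080 = stock_980ti * 1.17    # "beats the 980ti by like 15-20%"
oc_980ti = stock_980ti * 1.20      # assumed typical Maxwell OC gain

gap = (stock_1080 - oc_980ti) / oc_980ti * 100
print(f"stock 1080 vs OC'd 980 Ti: {gap:+.1f}%")  # → -2.5%
```

On those numbers a well-overclocked 980 Ti actually edges out a stock 1080, which is why the 1080's own limited OC headroom matters so much for the price argument.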

>the country that surrendered to their turkish masters
>removing kebab

Ha

>NVIDIOTS ON SUICIDE WATCH!

Nah, the opposite is the case.

I've read various reviews of the GTX 1070 now.

It will be a bestseller card for sure.

I mean, you're the one that's claiming Nvidia doesn't put a black box over everything they do, so the onus is on you.
I for one, can't remember a thing Nvidia did and made opensource for everyone to use, but since you clearly remember, maybe you should be the one providing the sources, how about that?

Thankfully the top card from AMD is a great overclocker

Oh wait, it can't even overclock 1 mhz

Neither AMD nor Nvidia ever released specifications for their cards. They all are black boxes. This is the reason open source drivers for them are such a joke.

>The Nvidiot instinctually begins comparing his bad card to older AMD cards

>15% slower than a 980 Ti once both are overclocked

youtube.com/watch?v=eMr7grvBljk

Anybody with half a brain will buy a 980 Ti when they start selling them off cheap to clear stock.

I wasn't talking about card specifications, I was talking about technologies. AMD opensources a lot of their stuff and creates new standards for the industry, while Nvidia keeps them to themselves.

>he spends time writing a shill post (for free)
>gets told to provide sources in his posts
>he obviously can't
>"b-but i-i didn't say nuffin"
>"y-you should be providing sources"
>"S-STOP QUESTIONING MUH BELOVED AYYMD REEEEEEE"

kek

kek

for the mad respekt broseph. nobody actually overclocks for performance

The 1070 draws much less power, supports DX12, HEVC 10bit 60FPS encoding and newer technologies, and has 8GB of RAM

Nvidia spent a year overclocking Maxwell...

Trying too hard only to make yourself look like a retard. I don't care for brands, I care for what's my best option and what they contribute to the industry. But whatever floats your boat, Nvidiot

BUTTCASTRATED HOUSEFIREVICTIM DETECTED

its a damn good thing i DONT EVER overclock

>he still can't provide those sources 5 posts later

kek

KEK

K E K

Nope

First of all it's the Founder's Edition, which anyone with half a brain won't buy.
Custom designs are better, which is not speculation.

And not everyone wants double the power draw.
The GTX980Ti at max OC becomes a power-hungry monster.

Also 2GB less RAM.

Even if they were sold at the same price, I would go for the GTX1070.

NVIDIOTS AND AMDRONES ON SUICIDE WATCH, THEY'RE FINISHED, DONE, BTFO AND BANKRUPT

>25% less power consumption
>4% faster
>150 bucks cheaper
Into le trash le fury x go's

>And not everyone wants double the power draw.

the 980ti draws about 25% more power, not 100% more

and to think amd will be selling the fury series as their high-end cards for another 8-12 months before vega hits. i can't see this ending well for them, considering their top card is worse than the mainstream nvidia card.

What are you saying?

Have you even watched the video?

1070OC vs. 980Ti OC is -7% performance on average, not 15%


Stop talking shit when your link doesn't even support your statement!

If they want to clear any of their inventory they will obviously have to slash prices on the fury x big time.

Or nobody but blind fanboys will pick it up over the 1070

Stupid shills should know better than to promise overclocking potential.

CUDA cores... as of 2014 there were only about 250 CUDA applications, none of which you fucking use.