GeForce GTX 1060

the RX 480 arrives and then nvidia crashes the party with the 1060

Paper launch.

Crash with what? 2000 units globally?

More edges than my 14 year old nephew, how fitting for an Nvidia product.

Is it gonna be 5% slower than the 480, use 10 less watts but cost $240 starting?

BUT MUH EFFIENCY XDDD
10WATTS IS LIKE 100 MAAAAAAN

Every 30W expelled increases room temperature by 3C man
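
Napkin math for the curious. This is only a sketch, assuming a small sealed room with no ventilation and no heat loss through the walls, so it's an upper bound and every number below is an assumption:

# How fast a constant heat source warms the air in a sealed room.
# Assumed: ~30 m^3 of air, no ventilation, no wall losses (upper bound only).
ROOM_VOLUME_M3 = 30.0
AIR_DENSITY_KG_M3 = 1.2
AIR_SPECIFIC_HEAT_J_KG_K = 1005.0

def temp_rise_per_hour_c(watts):
    air_mass_kg = ROOM_VOLUME_M3 * AIR_DENSITY_KG_M3
    joules_per_hour = watts * 3600.0
    return joules_per_hour / (air_mass_kg * AIR_SPECIFIC_HEAT_J_KG_K)

print(temp_rise_per_hour_c(30))   # ~3.0 C per hour from an extra 30 W
print(temp_rise_per_hour_c(150))  # ~14.9 C per hour from a 150 W card, before any losses

So 30W does work out to roughly 3C per hour in a perfectly sealed box; a real room leaks most of that, which is where the window comes in.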

AMD shills already on full force

>gimping perfectly good 1070 chips
1070s are already gimped 1080 chips and there are barely any around. You're a moron.

How many of these threads do you want to make Jin Yang?

Do you get commission for each one?

none unlike you Rajesh

How could this happen?

So now we will be awaiting even more benchmarks. How is a man supposed to buy a card?

>380x fighting 970
when did this happen

I hope that's not real, but I'm sure it is

That's why you open the window...

It doesn't seem to be fighting it very well except on select settings, like medium on Metro. Bump the settings to very high and you get 33 vs 49. Almost 50% advantage for 970.

That said, the numbers look fairly inconsistent, so I wouldn't put that much faith in them.
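
For anyone checking the arithmetic on that very high Metro figure, it's just the relative FPS gap (the 33 and 49 are the leaked numbers quoted above, nothing new):

# Relative advantage of one card over another, given average FPS figures.
def advantage_pct(fps_fast, fps_slow):
    return (fps_fast / fps_slow - 1.0) * 100.0

# Metro at very high settings, figures quoted above.
print(advantage_pct(49, 33))  # ~48.5%, i.e. "almost 50% advantage" for the 970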

>live in arizona
>its 110f out

It's gonna be as fast as 980

AMD = FINISHED AND BANKRUPT

god that's ugly

It's normal. How fucking new are you retards to GPUs?

Every new generation is just more or less a move up or down a tier. The only difference this generation is that the jump is a bit more pronounced since we haven't had new GPUs (9/2014) in almost 2 fucking years.

So it makes sense that the 480 is more or less the same as a 970. They'll be trading blows and effectively make the 970 a shit in terms of price/performance against the 480. Nvidia are just being little pricks with fucking prices right now.

I hope Nvidia waits until a bunch of tards buy the 480 only to release the 1060 and beat it for the same price just to teach them a lesson and have everyone make fun of their builds. They can afford to do it too.

You just buy the Nvidia card you can afford and be safe in knowing it won't be beat.

lol

fake

delete this

lol

How much is that 970 OC'd? Because the ZOTAC AMP Extreme, for example, is almost 30% faster than a stock 970; that's nearing a 980.

DELETE THIS SHIT

This is clearly biased against Team Green....

From the picture we don't know if those crazy Poles used the 480 driver (that's currently under NDA) or simply used the latest beta AMD driver, which has absolutely zero support for the 480.

If you say so user.

So you're saying AMD has no drivers.

THANK YOU BASED NVIDIA

This is the first gen where Millennials were able to afford to buy the card above midrange. They now can't deal with the reality that their 970 is barely midrange.

Takes a very far gone AMDrone to actually imply that AMD's Black and red GPUs aren't edgy gamer gear as well.

Yeah but it won't suffer from driver overhead, making it a much better budget GPU

Is it safe to say it will be comparable to the 980 speed? What's the 960 compared to the 700 series?

This generation is a little special because it's the end of the massive 28nm stall.
This kind of generation improvement is what was normal, 5 years ago.
It's only recently that shit stalled and companies were competing against their own previous generations

so my 280 under full load should be able to boil me an egg...

If the GTX 1060 costs around $279 or $299,
I will buy it instantly, no regrets.

I wonder if anybody plays those games.

The 1060 has 1.5 times fewer cores and probably also lower clocks than the 1070. Nvidia will have a hard time selling 1060s because the 470 and 480 will outperform them for fewer bucks. Just by looking at the raw technical specs it can't be in any way faster.
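
If you want rough numbers behind that, peak FP32 throughput scales with cores x clock. A quick sketch below; the core counts and boost clocks are rumored/assumed figures, not confirmed specs:

# Peak FP32 throughput estimate: 2 FLOPs per CUDA core per clock (fused multiply-add).
# Core counts and boost clocks below are rumored/assumed, not confirmed specs.
def peak_tflops(cuda_cores, boost_clock_ghz):
    return 2.0 * cuda_cores * boost_clock_ghz / 1000.0

gtx_1070 = peak_tflops(1920, 1.68)  # ~6.5 TFLOPS (rumored)
gtx_1060 = peak_tflops(1280, 1.70)  # ~4.4 TFLOPS (rumored, 1.5x fewer cores)

print(gtx_1070, gtx_1060, gtx_1070 / gtx_1060)  # 1070 lands roughly 1.5x ahead on paper

Real game performance won't track paper TFLOPS exactly, but it gives you the ballpark gap between the two chips.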

EPIC MEME FAMPAI

Yes, it's a special generation but it's not overly so. Instead of a simple move up the tier you also get a nice boost as well. The 2year/28nm delay is a huge part of it. It is still within normal expectations of anyone with any recent knowledge of GPUs.

The same shit has been happening on the CPU side but in that case there's also the fact that there's no competition to factor in.

This is really bad.

The 480 is worse than a 970, it can't even compete with a 2 year old card at the same price point.

bye bye amd

it's an oc'd 970 at 980 levels

Looks fine to me.

>tfw it will cost $400-500 in australia

Maybe even $600, since the RX480 is $400+.

Well but you can just buy a Zotac 970 for the same price anyway, so not much point debating which edition. The 480 will also have """OC""" editions.

Lol no it's just a normal Zotac OC card

Where da 1050 at?

raping the RX 470 and RX 460s

Is there any reason you can't buy from a US/Canadian source and get it shipped over?

>What's the 960 compared to the 700 series?
A bit slower than 770, apparently.

thank you

So basically it'll be around a 970. 15% better than a 970 at best.

I know for sure that historically AMD's performance in Witcher 3 isn't up to par: the 390X performs around 970 level there, when in other games the 390X is on par with or just slightly above a 980.

World of Tanks? Give me a proper game.
Don't know about Metro: LLR, anyone got some newer Metro benches?
This could be another case of a bad choice of games for one team.

Ends up costing the same as local after shipping / conversion / GST is tallied up.

It's a bit slower on paper, but faster in most games.

It's a fair game.

Metro, Witcher 3 are notoriously bad games for AMD cards.

And world of tanks is, like you said, a 'who gives a shit' game.

ok so I wasn't around during last gen's launch, did they start with inflated prices as well?
Like the 1070 right now seems about $100 higher than it should be, and I'm wondering if and when prices will go down.

Another paperlaunch card, great.

>polsky gaming site with low rep
yeah m8, naw, they are known for being retards on game reviews and now ppl believe their totally legit 480 leak.
At least post the full picture, Nvidia shill, but you don't want to post the full thing because then ppl would see how badly the polskys did their job, not even knowing how to write benchmark names....

Leaning towards Nvidia this time since the AMD cards look shit, but what about down the road? The 680 and 7970 traded blows back in the day, but now the 7970 wrecks the 680. Do Nvidia cards have some sort of longevity issue?

Nvidia have 10 year long driver updates!
AMD doesn't update 2 year old cards LOL

More like 15% worse than a 970

Nvidia has a monopoly issue where they gimp their older cards on purpose so people upgrade. And since AMD isn't competition, they're forced to upgrade to nvidia cards.

Although the reality is more like this: Nvidia doesn't bother improving older gen cards when the newer ones are out, but AMD spends time and money improving drivers for cards that have already been bought.

Kyle Bennett's doomsday predictions for AMD are seeming more legitimate with every passing hour.

Maybe Polaris really is a house fire flop.

The house fire memes are really being misapplied. The RX480 uses less than 150W.
You know why the GTX 480 was called a housefire?
BECAUSE IT USED 250 FUCKING WATTS AND GOT TO 98C

>Extreme

Do they mean Ultra? Because every Firestrike Extreme benchmark we've seen so far has it at about 5900-6000. Benchmarks from Ultra showed it nearing 2900, which would indicate the numbers here are 200-300 lower. The leaked benchmarks have been a little all over the place, but the ones posted here seem incongruous with the rest.

You also have to consider the next housefire reference card - the 290x - is a monstrously bigger chip and many years newer (as well as having power draw not too far removed from big kepler).

Fermi really will stand the test of time as a housefire card

I didn't believe him, but he warned us. He was right.

pretty safe to say that AMD and Nvidia aren't even attending the same parties anymore

seems like Kyle was right

lmao AMD BTFO into orbit

I bought an MSI 1070, however I have an i5-4570 (4x 3.2 GHz) and 8GB RAM.
Will my setup bottleneck me a lot, or is it going to be bearable? I want to play Total War games as well.

It's fine.

I'll add a bit more.
The Intel CPUs that are barely starting to cause significant bottlenecks are Sandy Bridge based ones.

youtube.com/watch?v=4sx1kLGVAF0

Your i5 shouldn't be that far behind the i7 of the same gen.

>The Intel CPUs that are barely starting to cause significant bottlenecks are Sandy Bridge based ones.

If you believe this then you are a fool.

>i7 5930k + 1080
>92.4/61.6

>i5 4690 + 1080
>87.1/54.1

What is that? A 5-8% difference between an enthusiast class and a peasant class? I'll stick with what I said. Especially considering that you just brought up one of the most extreme examples of a CPU intensive game.
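
Plugging those two FPS pairs in, for what it's worth (arithmetic only, the figures are the ones quoted above):

# Percentage gap between the 5930K and 4690 results quoted above (avg / min FPS).
def gap_pct(fast_fps, slow_fps):
    return (fast_fps / slow_fps - 1.0) * 100.0

print(gap_pct(92.4, 87.1))  # ~6.1% on average FPS
print(gap_pct(61.6, 54.1))  # ~13.9% on minimum FPS

The average FPS gap is in that 5-8% range; the minimum FPS gap is noticeably bigger.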

You clearly don't know what that chart is showing - namely an enormous cpu bottleneck on the high end. When the resolution increases the delta between gpus paired with the 5930k increases as the cpu bound situation turns into a gpu bound one.

You can get similar results in Ashes; the in-game benchmark for that does actually show how CPU-bound you are.

I never said anything good or bad about nvidia or amd faggot. It's not my fault Nvidia's tessellated cooler shrouds look like edgy gaymur bullshit

Nonsense dude, you would die if you gaymed on a 150-200W graphics card then. (300-150 W full PC, and you got to add the LCD).

Will probably cost more for same performance, knowing Nvidia.

I don't know why AMD shills refuse to believe something like this could happen when it was super obvious; anyone with a brain could see this coming. They are one refresh behind Nvidia architecture-wise. They thought that rebranding the power-hungry 200 series would be a good move, and now the power-hungry dog is coming back to bite their lazy ass. Meanwhile Nvidia decided to maximize performance/watt and then move to a new shrink node, plus improvements to the caches, more efficient algorithms and math pipelines, and the delta compression that Nvidia already knew they could further improve on from Maxwell; all of that brings a huge performance boost for Pascal.

In other words AMD skipped their "Maxwell". The risk of jumping to a shrink node plus having a major architectural overhaul at the same time was a stupid decision, and now they are behind Nvidia by a huge margin and it's very unlikely the gap will close.

>they thought that rebranding the power-hungry 200 series would be a good move

Considering AMD gained marketshare for the 3xx series it worked to a degree.

>Comparing against overclocked cards instead of reference.

That shitty magazine could not even get the 3D Mark scores printed correctly (3 instances of Firestrike shown for the 480).

Tomorrow will give us more accurate figures. Stop looking at shitty bait sites.

A bit OT, but I ordered an Nvidia-built 1070. Compared to an MSI or even ASUS built 1070, will it be better or worse than those?

So they are releasing this shitty little card before the 1080ti?

How long am I going to have to wait?

I was thinking about this. Couldn't it be that the performance of AMD cards increased due to faster CPUs?

AMD put their eggs in the 22nm basket, dude.

If you're asking "what 22nm?", that's exactly the point: it never materialized, and that's why AMD has been lackluster.
That and the lack of money.

No. It's just that AMD has fewer resources to optimise their drivers quickly, so it happens more slowly.
Nvidia drivers start out better optimised, but when AMD's drivers improve, the gap shrinks dramatically.

It doesn't help that nvidia actively sabotages its back catalogue either

>AMD put their eggs in the 22nm basket, dude

What? AMD haven't released a single 22nm product to date. Polaris is 14nm.

So its about 980 level.

That's the fucking point, dipshit. The 22nm process fell through and all the work they put into 22nm cards was wasted.

Expect to pay $350 at launch for the founders edition (if you can find any).

You're an idiot, 14nm is better, why would they waste time on 22nm you literal retard.

Are you retarded?

They were working on 22nm back during the 200 series. But the process fell through and they had to wait until 14nm.

Do you seriously not understand time as a factor? AMD wanted to go from 28nm to 22nm, then to 14nm when that became available.

Look I just don't know how I can make this any clearer, you're being purposefully stupid.

Back to ribbit you fucken retard.

Contrary to what was said above, without further information it's hard to isolate exactly where the increase is coming from. The following are all viable factors:

1) faster cpus used in benchmarking
2) AMD's own software improvements in the driver
3) Game engines moving towards workloads GCN is good at

Most likely it's a combination of all three, but there is no real way to find out without extensive testing.

Pic related is a hilarious example of either AMD's gains or Nvidia's gimping.

Provide a single shred of evidence to back up your claim that AMD invested heavily in 22nm and had to scrap it all.