'6Gb is enough for any game' Nvidia said

digitaltrends.com/computing/amd-vega-graphics-chip-launch-windows-confirmed/

16GB HBM2 on Vega 10.

Fuck me. The only thing that makes me feel okay with my Fury X is that Vega 10 is also supposedly going to have only 4096 stream processors. If that's true, I can't imagine the jump from a Fury X will be worth it.
Another consideration is that I've got a custom cooling loop, so getting a new GPU would also mean needing a new block and backplate.

Games are made with the lowest common denominator in mind. If consoles and Nvidia cards don't have more than 6 or 8 GB, then that's the useful limit.

Owning 2 AMD cards, I must admit it is becoming a meme. Instead of making the GPU more powerful, let's just throw more VRAM at it, because bigger numbers are better, right? AMD needs to fucking sort their shit out. Vega had better fucking deliver, because you know that as soon as it drops, a few weeks later (or less) Nvidia will drop the 1080 Ti and wipe the floor with it.

>video cards have 16gb of memory now
>i just upgraded my computer to 4gb ram

More vram won't do shit.

Is HBM2 something to look forward to, or just a meme?
I skipped this generation to get HBM2 memory in mid-to-low-end GPUs.

computerbase.de/2016-09/grafikkarten-speicher-vram-test/

tl;dr

More VRAM is better.
More FAST VRAM is even better still.

Wipe the floor with a $600+ GPU? How many 1080 builds have you seen? Yeah, I know: only a few. Most of the PC builds created with PCPartPicker are under $1000. The market for the 1080 Ti will be tiny.

...

Giving that card only 8 GB of HBM2 would not even make it $5 cheaper.

HBM2 is dirt cheap to produce compared to GDDR5 or HBM1.

Only AMD fags would fall for this specsmanship

MORE CORES
MORE VRAM

Finally, maybe I will get a GPU that won't get bottlenecked by its VRAM. The 290X is still a great card, but it gets bottlenecked by only having 4 GB of VRAM.

Turn the texture quality down.

That is literally how GPU design works.
GPUs are massively parallel processors.

TL;DR for those that don't speak German and don't want to Google translate:

3 GB is not enough at 1080p for ultra, and not even for high anymore. Don't buy a 3 GB card if you plan to use it for today's games. Bad idea.

4 GB is the same way, but it's a fair bit better in the tests; less stuttering is evident. 6 GB is already used with ease by games optimizing for the 390/480. The 6 GB 1060 is a bad long-term investment, even at 1080p. Don't bother with it at 1440p.

As a side note, you can at least see and understand the frametime graphs without knowing German. Strangely, Nvidia's cards all seem to exhibit severe hitching. Obviously, the cards with less/slower memory are worse about it, but let's take a look at Black Ops 3.

See the latency spikes on the Nvidia cards, how the valleys and hills are much more pronounced? To get a closer look, flip between the Extra quality settings. AMD has superior frame pacing here.

In Mirror's Edge, the 4 GB of the 480 was enough for Hyper textures. The 3 GB 1060 is unplayable. Hell, the 1060 can't even do Ultra.

ROTTR is mostly a tie.

Deus Ex shows the 470 destroying the 1060. It's an AMD-optimized game and Nvidia's drivers seem busted or something because the frametimes are not ideal, even on the 6 GB 1060.

This unexpectedly turned into a wall of text. I need to see more frametime analyses, but it looks like Nvidia is losing on the frametime (i.e. smoothness) front.
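
If "frametime analysis" sounds abstract: frame pacing is just the variance in per-frame delivery times, and you can quantify the hitching from a frametime log in a few lines. A rough sketch (the sample numbers are invented for illustration, not taken from the computerbase data):

```python
# Rough sketch of quantifying hitching from a frametime log (values in ms).
# Sample numbers are invented for illustration, not taken from the article.
frametimes_ms = [16.7, 16.9, 16.5, 48.2, 16.8, 17.0, 16.6, 52.1, 16.7]

avg = sum(frametimes_ms) / len(frametimes_ms)
worst = max(frametimes_ms)
# Count frames that took more than twice the average: these are the
# visible "hills" on a frametime graph, i.e. perceived stutter.
spikes = [t for t in frametimes_ms if t > 2 * avg]

print(f"avg: {avg:.1f} ms, worst: {worst:.1f} ms, spike frames: {len(spikes)}")
# Two cards can post the same average FPS while one of them hitches
# like this constantly; that's what the frametime graphs expose.
```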

>First half of 2017
fuck it lads
why is AMD waiting so long just to release a 1080/1070 competitor? Nvidia will have the 1080 Ti to fuck them over with by then. Didn't they learn from the Fury X?

You're forgetting the architectural improvements present in the 400 series GPUs: the 480 has 32 ROPs and is on par with a 390, which has 64 ROPs.

Now let's say Vega has the same 4096 SPs, 64 ROPs, and 256 TMUs as the Fury X. I think that, while compute will only be 10 TFLOPS, things like geometry processing and other good stuff will be much higher.

How the fuck are you using that much VRAM?
The only games that use much are Max Payne 3 and GTAV in 4K.

Rise of the Tomb Raider regularly uses over 6GB on Ultra.

Except for when you wanna play Stalker with mods

Oh god yeah, the geometry will be a lot better, certainly, but worth jumping from a Fury X for?
I can't justify another stupidly expensive purchase so soon.
I mean, sure, it'll likely have been around two years when the successor comes out (I got my Fury X on day 0), but I only got my OLC some time around... June?
I'm looking at changing out some of the tubes for hardline as it is. Adding in a new GPU, block & backplate would just be too much.

Once the 590, or perhaps even 690 comes out, then will be the right/best time to upgrade.

Only 0.5 GB more VRAM and 970 users wouldn't get this screen.

How high res are these textures man

It wouldn't need to if it had 8 GB (390X).

Deus Ex is also really shittily programmed; they managed to make DX12 LESS efficient than DX11.

>falling for the 16GB meme
:^)

Nixxes are supposedly very good developers. They have got RotTR DX12 working quite well with my RX 480, on an Nvidia-sponsored game no less. Give it time; it's still in beta.

VRAM is cheap; there's no reason NVIDIA or AMD shouldn't include more by default.
NVIDIA in this case is just trying to defend why it refuses to give customers more, when it would literally barely affect their cost of production. The reality here is that NVIDIA has always been into planned obsolescence with drivers, VRAM, etc., and they will never budge.

>Give it time it's still beta.
Oh? I haven't followed it. Why are people quoting a benchmark that's still in-dev?

If VRAM were cheap, AMD would've given the RX 480 HBM and made it like a Nano.

>Mirror's Edge, the 4 GB of the 480 was enough for Hyper textures.
Barely. It's just that 3rd and 4th gen GCN have texture compression and 1st and 2nd don't.

>I can't justify another stupidly expensive purchase so soon.

but you actually bought a Fury X, and you're being a bigger retard by putting it in your OLC desu

>1080Ti
imgur.com/gallery/Bk4NO
wccftech.com/nvidia-gtx-1080-ti-gp102-specs-leaked/
I am really hoping for a 1060 Ti and/or 1050 Ti now.

>Only uses 50w more than the rx480 but has 5x the performance
fucking KEK

>Bigger retard putting it in an OLC
Actually, its temps dropped by 20°C under load.
Going from 61°C to 41°C at absolute maximum is nothing short of substantial.
That lone 120mm radiator, no matter how thick, was not enough. Sure, it was enough if you kicked the fan up to a higher speed, but then it sounded like a fucking jet engine.

Was the temperature throttling it? No. But having it drop like that shows there was definitely a lot more room for it to run cooler, given the chance.

>1060 ti
never going to happen
>1050 ti
already out, it's called the 1060 3GB

*100 watts
and 50-60% more performance at best

>never going to happen

Had the 480 been faster than the 1060, it would have. There are ~10 device IDs for GP104 variants, but so far we've only gotten 5 (1070, 1080, mobile variants, and Tesla P4).

A 1070 with 2 more disabled SMs and 4 GB of VRAM would have been great value.

>'6Gb is enough for any game' Nvidia said.

There is no card with 6Gb of memory on the market currently.

IIRC Mirror's Edge actually overrides your texture setting if you have too little VRAM unless you untick a box, which led to a 4GB Nvidia card beating the 8GB AMD card when the box was ticked, but the other way around when it wasn't, in Digital Foundry's tests on Hyper. I don't speak German, but it wouldn't surprise me if they didn't test that game properly.

These things will be worthless when the Unlimited Detail engine is released anyway.

GTX 1060 6GB
Fucking moron.

The 380 was faster than the 960 and there was never a 960 Ti; the 280 was faster than the 760 and there was never a 760 Ti.

Gb =/= GB, moron. 6Gb is 768MB.
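
For what it's worth, the pedantry checks out; quick arithmetic, nothing card-specific (8 bits per byte):

```python
# Gb (gigabits) vs GB (gigabytes): there are 8 bits in a byte.
gigabits = 6
megabytes = gigabits * 1024 / 8  # 6 Gb = 6144 Mb = 768 MB
print(megabytes)  # 768.0
```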

Oh, don't play semantics, you autistic retard. Yes, there is a difference, but did you HONESTLY think that the OP cared whether he put a big or little B?
At the end of the day, anyone who isn't mentally handicapped will understand it as 6GB.

>the 380 was faster than the 960

not in real-world usage (i.e. not pairing the card with an overclocked i7)

>and there was never a 960 ti,

The 970 was the 960 Ti. It had a discounted price relative to past x70 cards, and had more disabled SMs and a gimped memory bus like previous x60 Ti cards.

>the 280 was faster than the 760 and there was never a 760 ti

The 760 was a rebrand of the 660 Ti, and there was a 760 Ti, with 1344 CUDA cores.

en.wikipedia.org/wiki/GeForce_700_series

>'3.5GB is enough for any game' Nvidia said.
Fixed.

>End of the day, anyone who is a tech illiterate AMD fanboy will read it as 6GB.

Fixed that for you. OP very clearly meant Gb, otherwise they would have typed GB.

>6Gb is enough for any game Nvidia said.
This year yes, but next year, they'll say it's 8GB

I got this with my 3GB 780; all games performed brilliantly, but the moment Maxwell dropped with 4GB, suddenly 3GB wasn't enough. At least getting a card with 8GB minimum will future-proof it for a few years more.

Uh huh. Of course they did. By the way, did you forget your medication today, Billy?

The high VRAM meme needs to end.

6GB is more than enough. Autists just like seeing the word "ultra" in their settings when the compressed "high" textures look identical.

Even the 970's 3.5GB is still enough. I've had no problem playing new games on high settings and maintaining 60 fps with an OCed 970.

It's more AMD trying to find ways to market, so fanboys can have something to justify it over getting an Nvidia card.

Same with their processors having higher clock speeds; gaymers love that shit.

Sure it does pal

youtube.com/watch?v=SUEXhu3tm3M

Wouldn't ticking the box turn down the graphics settings for the Nvidia card, meaning it would have inherently less work to do, and therefore perform better?

>16GB HBM2
I can finally play Fallout 3 on ultra high!

Yeah, that's the point. If the box is ticked, the game stops you from setting the options too high, but the UI doesn't really tell you that. The settings menu will say the textures are on Hyper quality, but the game will actually use lower-quality textures.

It's a smart way of making people think their cards are better than they are and/or preventing people from setting the game to settings way out of their league and then complaining about "muh optimizashun" like Sup Forums loves to do.
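
Presumably the clamp works something like the sketch below; every name and VRAM budget here is invented for illustration, this is NOT actual Mirror's Edge code:

```python
# Hypothetical sketch of a silent VRAM-based settings clamp.
# All names and GB budgets are invented; this is NOT actual game code.
TIERS = ("high", "ultra", "hyper")                   # ascending quality
VRAM_COST_GB = {"high": 3, "ultra": 4, "hyper": 6}   # made-up budgets

def effective_texture_quality(requested, vram_gb, limit_by_vram=True):
    """Return the texture tier the game actually uses at runtime.

    The menu keeps displaying `requested`; the clamp is silent,
    which is why the UI and the rendered output can disagree.
    """
    if not limit_by_vram:  # the "untick the box" escape hatch
        return requested
    # Walk down from the requested tier until one fits in VRAM.
    for tier in reversed(TIERS[: TIERS.index(requested) + 1]):
        if VRAM_COST_GB[tier] <= vram_gb:
            return tier
    return TIERS[0]

print(effective_texture_quality("hyper", vram_gb=4))                       # ultra
print(effective_texture_quality("hyper", vram_gb=4, limit_by_vram=False))  # hyper
```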

Does that replace Crysis as the new meme?

Did you not see me use the word "texture", retard?

Also, this is how much VRAM Witcher 3 uses.

High VRAM is a shitty marketing meme.

Are your GPUs shit?
Add more Vram because bigger is better!

Here's a quick lesson in GPU scaling. In the past, when everyone was gaming at 1080P and only a few nerdy types were gaming at 1600P, what was called the 'Ultra' setting quickly became the 'High' setting, and the 'High' setting became the new 'Medium' setting with the following GPU generation.

This was all well and good.

However, a new dynamic is being added to the mix. The number of 1440P gamers has increased rather substantially, and the proliferation of 4K TVs has pushed us into a corner with GPU performance. Gamers with cash to splash are demanding more from their GPUs' architecture for a similar amount of money. Unfortunately, we are not quite there yet. We have yet to reach the point where a single GPU can play at 4K 60 fps adequately and affordably, especially on higher settings.

This is a big transitional period and a very bad time to be upgrading your GPU. Next year we will see 4K 144Hz monitors hitting the shops, pushing the demands on GPU architecture even more.

You know that big blockbuster movie with all the fancy CGI in it? Most of it is rendered at 2K and upscaled to 4K projection, and even then it takes large render farms a day to render maybe a few frames.

>AMD’s chief technology officer, Mark Papermaster
>Mark Papermaster
yfw Papermaster confirms another paper launch kek

Are your cpus shit?
Overclock them!

>You know that big blockbuster movie with all the fancy CGI in it? Most of it is rendered at 2K and upscaled to 4K projection and even thne it takes large render farms a day to render maybe a few frames.
not anymore with Radeon Pro SSG

Can't wait for it to be 5% better than a 1080 at twice the power usage and temps.

I hope so. Avatar was only rendered at 2K. BTW I have yet to find any info on what resolution the movie Antz was rendered at. Most likely the same res as Toy Story was before it got re-rendered for HD.

Why do enthusiasts act like the average consumer gives a shit about 4K? GPUs aren't going to suddenly get faster because of 144Hz 4K monitors.

Going 4K now basically means you're an early adopter, so you have to deal with all the headaches that come with that.

1440p is the way to go right now; 4K will remain meme status for years.

The Steam survey said fewer than 4% of gamers were using above 1080p.

Haven't you seen the benchmarks? Nvidia wins every time. AMD users on suicide watch.

AMD doesn't do paper launches; Buttcoin miners just steal everything before gaymers can buy.

The Fury X is actually faster than a stock 980 Ti in a lot of games now due to driver improvements. AMD seems to be stepping up their driver game and it's working.

I could run Fallout 3 on a 256 MB DDR3 8600GT back in '07...where the fuck has the time gone??

DX12 is only less efficient than DX11 if you have a Nvidya card.

Instead of forcing hardware manufacturers to work overtime and create more expensive bullshit, why don't video game developers just optimize their games better, so better-looking games can run on less intensive hardware?

Too much effort, doesn't bring in the money.
No, I am not joking, this is literally the reason why they don't do it.

>GTAV in 4k
Even @ 1080p, GTAV uses over 4GB of VRAM.

5x more performance?

No it doesn't you hobo.

I have a 27'' 1440p IPS 120Hz and a 32'' 4K IPS 60Hz.

And I cannot wait til we get 4K 120Hz.

Fuck 1440p. 4K on a 32'' is gorgeous. I'm running Windows with 200% scaling and it's a dream to use.
Everything, from text to icons, looks so much better.

The only downside is the 60Hz, which isn't as smooth as 120Hz, but I'm just gonna wait.

Tell that to my multiple monitors.

But you can't have 8GB of HBM1.

...

Yeah, they usually leave out the part that it's in beta. It's pretty fucking stupid how all of these review sites are acting as though it's some kind of final product.

>not in real world usage (i.e not pairing the card with an overclocked i7)
An i5 is enough for the 380 to surpass the 960.

>970 was the 960ti
Nope, 970 was the 970

>760 was a rebrand of the 660 ti, and there was a 760 ti with 1344 cuda cores.
So the 760 was a 760

Rise of the Tomb Raider uses 7.8GB on my GTX 1080 at 1080p with every setting maxed out.

Newer games are VRAM hogs.

>AMD reaffirms when its upcoming 'Polaris' graphics cards will hit the market
Don't they mean Vega?

> It will be offered in two variants: Vega 10 for the enthusiast market and Vega 11 for everyone else.
Unsubstantiated rumors, though I bet there will be two cards. Doubtful that even the cut-down chip will be "mainstream", considering it will be $350+ at minimum.

Does the author even know what he's fucking talking about?

>Dean McCarron, principal analyst at Mercury Research, told the IDG News Service that Nvidia’s market share decline was due to it “de-emphasizing” the sale of graphics chips in large volumes through mainstream OEMs.
Kek'd, can't compete with AMD's console volumes / pricing structure

You can definitely still play games with a 2GB card (not sure how long this will last at 1080p), but most games will soon come with Hyper settings that use more than 4GB. Mirror's Edge Catalyst and RotTR come to mind. You might care if you're PC mustard race.

Having said that, consoles have 8GB of shared memory. Some gets reserved for the OS, so 4GB will get you good textures.
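
Back-of-the-envelope, with assumed numbers (roughly in line with publicly reported PS4/XB1 reservations, not figures from this thread):

```python
# Rough console memory split; every figure here is an assumption.
total_shared_gb = 8.0
os_reserved_gb = 2.5    # OS / background reservation (reported ~2.5-3 GB)
cpu_side_gb = 1.5       # game logic, audio, streaming buffers (a guess)

gpu_assets_gb = total_shared_gb - os_reserved_gb - cpu_side_gb
print(f"~{gpu_assets_gb:.1f} GB left for textures and render targets")  # ~4.0
```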

It'll be good on mid range GPUs, especially with texture compression

MORE LONGEVITY
MORE NVIDIA BUTTHURT

Honestly, the 290X isn't really powerful enough for 8GB. The 390 and 480 aren't really, either. Consider that Mirror's Edge Catalyst takes a shit when Hyper is enabled on anything but a 1070 or 1080.

>I can't justify another stupidly expensive purchase so soon.
I really don't know how you could in the first place

I know you're meming, but my 4GB 470 gets the same warning. Notice that it says "more than 4GB" which clearly means exactly what it says, MORE THAN 4, not equal to 4.

Agreed with the lack-of-VRAM thing.

That 1080Ti will actually be amazing for 4K. That means we mainstream plebs will have to wait two gens for affordable 4K gaming.

>>the 380 was faster than the 960
>not in real world usage
Yes it is in games except FO4 and GTAV, aka CPU heavy titles. On AVERAGE, it is.

>An i5 is enough for the 380 surpass the 960

No it isn't. All AMD GPUs lose substantial performance unless you're using an overclocked i7.

pclab.pl/art60000-21.html

I think they could have released a version of Vega this year, but the rumor of that and the release of the 480 forced Nvidia's hand, and they dropped the 10xx series a little early.

AMD saw that and for whatever reason decided to postpone the launch. My guess is they saw some improvements they could make to the Vega chips, so they decided to push those through now rather than on later Vega chips.

It doesn't look like they are rushing Vega OR Zen, which gives me a sense that they might actually have something really good, or really shit. I hope they have something good; there needs to be more competition in the market.

Vega was always planned for 2017. HBM2 isn't even being mass produced yet, and the Vega chips don't have PHY for GDDR5 or GDDR5X.

>AMD’s discrete GPU market share climbing from 26.9 percent to 34.2 percent in the second quarter of 2016 compared to last year. Nvidia dropped from 73.1 percent to 65.8 percent in the same quarter.
PREPARE FOR
T A K E D O W N
A
K
E
D
O
W
N

It was always officially planned for 2017, you are right, but earlier this year there were credible rumors they could release Vega in Nov-Dec, and I'm sure this spooked some people at Nvidia.

Who knows what they've got on the chip, though? HBM2 can be produced in significant numbers now, I'm sure. If AMD asked Samsung and Hynix, they would do it. Will the card come with it? I don't know; it would be a nice surprise if it did.

Who the fuck builds a new PC every time GPU lines are refreshed? I'm still using my Sabertooth X58 w/ i7 950 build from 2010; the only thing that has been upgraded is my GPU (ATI 5850 > GTX 770 > GTX 1070), and it's still maxing out games no problem. The need for the latest and greatest everything is a meme, granted the stuff you start off with is good in the first place.

Jesus. I've had my 760 for years and it plays everything I want on ultra settings. These cards are

so

unnecessary

I had a 760. Sold it for a 970. Huge difference, but I play things like GTAV and Witcher 3

>AMD board meeting

Nvidia
>Our sales are down 2.5%. Let's release a new series of GPUs that gives the illusion of good DX11 performance but has gimped support for DX12 due to a lack of async-capable hardware; that way we can give users a reason to "upgrade" to next year's line of cards. Let's also make sure to slowly kill performance through driver updates.

>it's a "by the time AMD catches up with Nvidia's 1 year old card, Nvidia already has a new series/Ti version available that beats it" episode
>it's always this fucking episode

Don't worry, it'll be $50 cheaper than the Nvidia equivalent, so it's better!

Best part is, all that money saved gets made back up by your power bill :^)