The next Geforce (for gaming) doesn’t use HBM2

fudzilla.com/news/graphics/43873-next-geforce-doesn-t-use-hbm-2

Thank god, we don't have to wait®

>Of course, Nvidia won't jump in and manufacture a high-end Geforce card with 21 billion transistors. That would be the Volta that Nvidia CEO Jensen launched back in May, and it would be both risky and expensive. One of the key reasons is that Nvidia doesn't really have to push the technology envelope, as the GP102-based 1080 Ti and Titan Xp still look really good.

>Our well-informed sources tell us that the next Geforce will not use HBM 2 memory. It is too early for that, and HBM 2 is still expensive. That is, of course, Nvidia's view; AMD has been committed to making an HBM 2 GPU - codenamed Vega - for more than a year now. Back with "Maxwell", Nvidia committed to a better memory compression path and continued down it with Pascal.

>The next Geforce - and its actual codename is still secret - will use GDDR5X memory as the best solution around. We can only speculate that the card is even a Volta-architecture Geforce (VbG). The big chip that would replace the 1080 Ti could end up with the Gx104 codename. It is still too early for the rumored GDDR6, which will arrive next year at the earliest.

>All eyes are on AMD, as we have yet to see Vega 10 launch. On its last financial analyst conference call, the company committed to launching the HBM 2-based Vega GPU at Siggraph. This year, Siggraph takes place between July 30 and August 3.

>July 30 and August 3

>AMD's lack of a higher-end card doesn't really help its financial cause, as you need high-margin cards to improve overall profits. The fact that the Radeon RX 570 and 580 are selling to miners definitely helps the RTG. The Radeon Technology Group is selling everything it can make, and that is a good place to be. The delay of Vega is not so appealing, but again, if the card ends up being a good miners' card, gamers might have a hard time getting hold of one at all.

Bullshit, Hynix itself came out and said there's a GPU with GDDR6 coming out in early 2018.

Everyone and their mother said it was Volta, or more precisely, GV102.

Now when I look at the Vega 11 release date, it might actually be Vega.
Holy shit.

HBM2 availability is so shit that AMD ordered Samsung HBM2

and Samsung?

Samsung doesn't make the 8GB HBM2 stacks that are used in Vega 10

>Fudzilla

hbm2 is overpriced shit

thank you based nvidia

honestly doesn't matter, if they do GDDR5X on a 512-bit memory interface there will be no shortage of bandwidth.
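Back-of-envelope numbers, assuming per-pin data rates that are plausible for GDDR5X but not confirmed for any unannounced card:

# Peak bandwidth = bus width in bytes x per-pin data rate.
# The data rates below are assumptions for illustration, not announced specs.
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return (bus_width_bits / 8) * data_rate_gbps

print(peak_bandwidth_gbs(512, 10))  # 640.0 GB/s - hypothetical 512-bit GDDR5X at 10 Gbps
print(peak_bandwidth_gbs(352, 11))  # 484.0 GB/s - the 1080 Ti's 352-bit bus at 11 Gbps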

>let's waste 100W on the memory subsystem

Lol no

>lol no
You are retarded or you fell for the HBMeme hype, pic very related. Also fun fact: GDDR5X/6 is even more efficient.

your posting is becoming more efficient. as if you've done this so many times.

>low clocked DRAM
>'theoretical' HBM1
>HBM1
>1
>GDDR still loses badly in every technical metric while still managing to use 15% of the GPU die for the PHY

Nah, GDDR is garbage, HBM is literally better in every single metric.

>12 GDDR5 chips @ 384-bit draw 31.5W
>GDDR5X draws less power than GDDR5
>GDDR is garbage

oh boy....it's not 100w

anandtech.com/show/9266/amd-hbm-deep-dive/4
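Quick sanity check on that figure - just arithmetic on the number quoted above, not a measurement:

# 31.5 W spread across 12 GDDR5 devices on a 384-bit board (the figure quoted above).
# DRAM devices only; the on-die PHY/controller power is a separate line item.
chips = 12
dram_power_w = 31.5
print(round(dram_power_w / chips, 2))  # ~2.62 W per GDDR5 chip - nowhere near a 100 W memory subsystem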

>Nah, GDDR is garbage, HBM is literally better in every single metric.
People will never pay all these shekels to save like 15 watts

>People will never pay all these shekels to save like 15 watts
This, and ultimately that is what it comes down to. Gamers don't give a fuck about the TDP of their $600+ video cards.

No it's not, because it's clocked higher and that offsets its voltage wins.

>lower power
>smaller controllers
>PHYs take up a small amount of die area, leaving more for shaders
>which also means not only the VRAM uses less power, but the PHYs as well
>smaller form factor cards
>direct cooling
>tFAW is twice as fast as on GDDR*

There is literally no reason not to use HBM besides price, and nobody cares about the extra $30-50 cost of superior technology when you're talking about $600 GPUs, which are made more powerful by HBM.
Nobody cares about budget 200mm² dies; those can use GDDR.

Point is, HBM is technically superior to GDDR in EVERY.SINGLE.WAY, you literally cannot tell me a single metric that is in GDDR's favor.
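For what it's worth, a rough bandwidth-per-watt figure for the GDDR5 side, using only numbers already posted in this thread (DRAM power only, PHY/controller excluded); the HBM side would need the figures from the AnandTech link:

# 384-bit GDDR5 at an assumed 8 Gbps -> 384 GB/s, paired with the 31.5 W DRAM figure quoted above.
# HBM values are deliberately left out - plug in the AnandTech numbers to compare.
def gb_per_watt(bandwidth_gbs, dram_power_w):
    return bandwidth_gbs / dram_power_w

print(round(gb_per_watt(384, 31.5), 1))  # ~12.2 GB/s per watt (DRAM devices only)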

>No it's not, because it's clocked higher and that offsets its voltage wins

It's not what?

It's expensive and useless at this point, Raja

If we can do the same shit with cheaper GDDR5X/6, why should mainstream customers pay more?

Because you can't do the same you fucking twat, and I just literally told you why.

Nobody loves HBM or consoles

Yet all the PC has is shit multiports.