Nvidia to use Samsung fabs for next-gen GPUs

>Samsung would start making the next-generation GPUs using its 14-nanometre production technology before year-end, based on the U.S. company's Pascal architecture.

reuters.com/article/us-samsung-elec-nvidia-idUSKCN10N0L0

How long before TSMC and AMD are both kill?


AMD is already dead

14nm Tegra chip is a possibility

So there will be a Pascal v2? NOO! I wasted all my NEETbux on the GTX 1070.

Probably the Ti

>Falling for the first gen process node scam
Never buy into the first generation of a node drop, idiot.

>>Never buy into the first generation of a node drop, idiot

>second gen is usually a rehash, might as well get first gen and avoid waiting a year

>third gen is the best, but you might as well wait for the new node idiot

>reuters.com/article/us-samsung-elec-nvidia-idUSKCN10N0L0

16/14=1.14

DOES THIS MEAN THE GPUs WILL BE 14% FASTER THAN CURRENT PASCAL?

I'd rather take the rehash over the original drop, because there are usually all sorts of hidden issues in the initial node drop.
Rehash fixes 'em.
Third gen IS best, and no, it's not worth waiting for the next node drop, because that could be a long, long time away.

No.

Samsung's current 14nm process delivers thermal performance similar to TSMC's current 16nm process.

The new 14nm process they're talking about might be superior to the current one, but we don't know by how much.
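
For anyone tempted by the 16/14 arithmetic above, here's a minimal sketch of why the answer is no. The numbers are just the marketing node names, nothing physical:

linear_ratio = 16 / 14          # the naive ratio from the post above
density_ratio = (16 / 14) ** 2  # ideal area scaling, since features shrink in two dimensions

print(f"linear:  {linear_ratio:.2f}x")   # ~1.14x
print(f"density: {density_ratio:.2f}x")  # ~1.31x, and neither number says anything about frame rates

Even the ~1.31x density figure is generous; real performance depends on architecture, clocks, and power limits, not the label on the node.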

I hope so, also hope it will be very expensive.

Why are you so mean? Volta is only coming in 2018; I'm not waiting that long.

14nm is a meme. Skylake and Polaris are both shit.

>tfw your patience paid off
>tfw it feels good to not be a full blown autismo with impulse issues

>using anime picture
>not full blown autist
pick 1

It will probably be for the GTX 1050. Samsung 14nm a shit, no really.

>implying anyone cares about a degenerate frog poster

>all sorts of hidden issues
It's Nvidia, man, all of their shit just works.

so are samsung and intel the two best in the world at making chips?

:^)
Coming from one of the biggest known Nvidia shills, I completely believe you. Yes, yes I do.

>pot calling the kettle black

Samsung only has one 14nm process with a few variants. There are no new ones, aside from the compact low-cost variant announced a long time ago.

The 1050 is just a binned 1060, and you're too dumb to understand electrostatic characterization of one process vs another.

AYYMD FINISHED AND BANKRUPT

Intel still has the best fabs.

I thought AMD was using GloFo

The iPhone 6S uses either a 16nm (TSMC) or 14nm (Samsung) process for its system on a chip. It's pretty much the same performance and power efficiency.

This is basically how I save money.

>Pascal v2
so Maxwell v3?

so still no Async or DX12?

1/2

New high-performance nodes worth the investment won't be here until 7nm is ready for mass production; everyone but Intel reached 14/16nm production viability just a few months ago.
We will probably see another four-year cycle on the same litho process, at a minimum, because 10nm is on a similar path to 20nm: not worth it beyond the smallest, lowest-power devices, where density and dies per wafer are critical, plus the HPC customers with similar requirements.
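
To make the dies-per-wafer point concrete, here's a rough sketch using the standard first-order approximation. The die areas are made-up examples, not any real product's:

import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    # Usable wafer area divided by die area, minus a term for
    # partial dies lost around the wafer edge.
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(300))  # ~197 candidate dies at a hypothetical 300 mm^2
print(dies_per_wafer(150))  # ~416 if a shrink halves the die area

Halve the die area and you roughly double the candidate dies per wafer, which is exactly why density-hungry, high-volume parts move to new nodes first.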

FinFETs won't save us here; they're a one-time solution. GloFo has even developed FD-SOI processes (traditional planar transistors with specially doped/strained silicon channels to reduce leakage) that rival FinFET leakage reduction and voltage minimums.

At this stage some physical limits are hitting a hard wall. Wire width cannot decrease without drastically reducing current-carrying capacity due to dangerous self-heating effects. Resistance is going up (where node shrinks used to bring a roughly quadratic reduction in wire resistance and dynamic power) because while length is being reduced, the width and wire walls just cannot shrink further without a loss in performance.

At these new transistor and power densities it is becoming much more difficult to control hot spots, signal routing and timing, and power envelopes, which makes the design phase far more intricate and time-consuming. A single mistake with the power target, leakage control, or thermal load in any one chip domain that reaches tape-out now ruins the chip completely. At that point a company risks either releasing a non-competitive and/or defect-prone product (looking at you, Qualcomm 810) or being six months, a year, or more late to market.
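
The wire-resistance point is easy to sanity-check with the basic R = rho * L / (W * H) relationship; the dimensions below are illustrative placeholders, not any foundry's real numbers:

def wire_resistance(rho, length, width, height):
    # Resistance of a rectangular interconnect: R = rho * L / (W * H).
    return rho * length / (width * height)

RHO_CU = 1.68e-8  # bulk copper resistivity in ohm*m; real nanoscale wires are worse

# An idealized 0.7x linear shrink: length scales down once,
# but the cross-section (width * height) scales down twice.
r_old = wire_resistance(RHO_CU, 100e-6, 50e-9, 100e-9)
r_new = wire_resistance(RHO_CU, 70e-6, 35e-9, 70e-9)
print(r_new / r_old)  # ~1.43x: the wire got shorter, yet resistance went UP

That's the whole problem in one number: every dimension you shrink to gain density costs you on the wires.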

How nice of Nvidia to follow AMD. Again.

2/2

IMHO all the big players should be pooling their research to find viable silicon alternatives, get off Si completely, or advance manufacturing techniques so that fundamentally new design paradigms can hit the market as soon as possible. They all know perf and power improvements from here on out are on their last legs. They all know that dicking around with shrinking Si-based tech is going to give less and less. They all know that radical arch design changes are just wishful thinking.

FD-SOI is exactly what the name implies: an insulating layer on top of the bulk silicon, with everything built upon and around that inherent insulation. GloFo's 22FDX isn't internally developed; it's licensed and slightly tweaked STMicro IP. The only thing that really makes these modern SOI processes interesting is how they leverage body biasing.
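
A rough sketch of why body biasing matters: subthreshold leakage falls off exponentially with threshold voltage, and body bias lets you nudge that threshold at runtime. The constants here are textbook-style placeholders, not 22FDX process data:

import math

def relative_leakage(vth, n=1.5, thermal_v=0.026):
    # Subthreshold leakage scales roughly as exp(-Vth / (n * kT/q)).
    return math.exp(-vth / (n * thermal_v))

vth_nominal = 0.35                # hypothetical nominal threshold voltage (V)
vth_reverse = vth_nominal + 0.05  # reverse body bias raises Vth: slower, less leakage
vth_forward = vth_nominal - 0.05  # forward body bias lowers Vth: faster, leakier

base = relative_leakage(vth_nominal)
print(relative_leakage(vth_reverse) / base)  # ~0.28x leakage in a low-power state
print(relative_leakage(vth_forward) / base)  # ~3.6x leakage when chasing clocks

Same transistor, same process: the bias just trades speed against leakage on the fly.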

PD-SOI and FD-SOI have been used forever.

On that note, GloFo has been fabbing IBM's POWER9 chips since acquiring IBM's foundry business. What's interesting here is that the chips are most likely using a 14nm SOI FinFET process developed by IBM. It'll be the first-ever production use of SOI-based FinFETs. The FinFET transistor was first envisioned as an SOI device, but the industry moved largely to bulk, so SOI FinFETs have never been realized until now.

Very interesting, user, thank you.

SOI is just another failed AMD meme

SOI was commonplace for years, tech-illiterate retard.

forbes.com/sites/patrickmoorhead/2016/07/25/amd-diversifies-14nm-manufacturing-with-samsung/

The 1080 Ti will be Samsung-fabbed.

I am calling it.

Pretty much.
Never buy a GPU

Nvidiots now seem to have no problem with 'Samshit'

>IMHO all the big players should be pooling their research to find viable silicon alternatives, get off Si completely, or advance manufacturing techniques so that fundamentally new design paradigms can hit the market as soon as possible.

Why can't they move to new substrates like MOSFETs have? There are SiC and GaN FETs now with higher switching performance.

>fucking LOLOLOLOLOLOLOLOLL

so glad I didn't fucking buy any Pascals, all those retards rushing out to buy 10XX cards

HAHAHA KEEKEKEKKEKE

>Node drops are GOAT!
>Don't get the first drop!

Sup Forums is full of contrary retards.

>every nvidiot a month after buying a current gen GPU

Never change nvidiots

>Never buy into the first generation of a node drop, idiot.

just keep waiting for the next best thing...

>Nvidia to sue Samsung fabs for next-gen GPUs

I'd take Korean fabs over Taiwanese any day

Kinda agree with you, but I don't think it'll take four years for 7nm GPUs to show up. 28nm to 14nm took so long partly because of the global recession. We're probably on a tick-tock cadence for GPUs this cycle. 7nm is supposed to be up for mass production in 2018, so my guess is we'll see new GPUs on the node in 2019: 2016-17 for the 400/1000 series, 2018 for the 500/1100 series, 2019 die shrink to 7nm for the 600/1200 series.

>samsung buys nvidia
>apple buys amd for $5

It's getting harder and harder to shrink transistors.
I'm sure we won't see 10nm till 2020.

Never in my lifetime has there been a halving of litho size in a 24-month period. With litho facing fundamental physical problems only beatable by bleeding-edge techniques, we'll be lucky to get 7nm at a cost-effective level in four years.

Actually, if AMD is up for sale, then both Samsung and Apple will do anything in their power to get it. Imagine getting that fucking insane patent portfolio.

The new Nvidia Shield tablet was canceled, and now Samsung is making a new chipset even though Nvidia was buying from TSMC?

Looks like it's going to be a new Tegra.

Who Pascal 16nm here?