Transistors Will Stop Shrinking in 2021

>tfw the industry has abandoned hope for sub-10nm process after disappointing results
>tfw 10nm is the smallest we're going to get until the semiconductor material changes completely

spectrum.ieee.org/tech-talk/computing/hardware/transistors-will-stop-shrinking-in-2021-moores-law-roadmap-predicts

Eat shit, singularityfags.

>10nm is the smallest we're going to get until the semiconductor material changes completely
Then the semiconductor material will change completely. The industry is far too large with far too much money to throw at research for everyone to just throw up their hands and say "welp we hit the limit of silicon better close down shop"

find or fabricate a replacement semiconductor

Graphene processors when?

AMD will save us

If 10 nm is the best we can do, they can just keep pumping out millions of chips a year to meet demand desu; not being able to improve doesn't mean stopping production desu

A die shrink costs billions by itself; changing the materials used in transistors takes years of research and billions more, since the lithography changes so much.
graphene likes to be graphite too much for anything useful.

It does mean the end of doubling performance year over year, and the end of solid power consumption drops.

this just wait for year 2653

They've been researching this forever.

There have been working lab samples of graphene and diamond semiconductors for more than 20 years, but progress has been very slow even as the industry has rapidly approached the silicon node-shrink wall.

...

I, FOR ONE, WELCOME OUR NEW QUANTUM COMPUTER OVERLORDS

It's all over

Absolutely coincidentally, 2021 is the year that the last white man in the industry has been replaced with a diverse team of women and minorities.

moar coars

And they say programmers' jobs are future proof lmao

>emulate 5nm transistors on 10nm transistors
>everything goes twice as fast

Hi there

planck scale transistors when

so it's official then, the singularity is a pop-fiction lie

unless quantum computing turns out to be more than a meme, computers won't get any more powerful after 5 years, only bigger

>implying AMD isn't going bankrupt by the end of this year

Not programmers, because "monkeys", but the rest will be needed even more, because nobody can live without technology anymore, and the robots and the 3D stuff will take over the world. The rest of the jobs (not all) will not survive more than 30 years.

I am not joking, read my words carefully because it's true

>10nm

LOL

New transistor designs could in theory bring density improvements without major changes to the process itself: TSMC 16FF and Samsung 14LPP are essentially planar 20nm, except with a FinFET transistor design to give planar-14nm-equivalent density.

>he fell for the d wave meme

isn't Cannon Lake 10nm set to release next year (2017)?

10nm Cannon Lake will be mobile only. Their next desktop platform will still be 14nm.

It doesn't matter, we'll get past the limitations by then.

Already moving onto the next medium behind the scenes.

pcmag.com/news/345934/microsoft-stores-200mb-of-data-on-strands-of-dna

This trend has been obvious since 2000. I'm old enough to remember when the jigaherz were going up a ton every couple years. Now we've been stuck at 3.5ghz for ages and it's "more efficient architecture".

>I know nothing about quantum computers

we've been stuck at 5ghz for ages, not 3.5ghz. at least on intel's side, AMD only got >4ghz clocks consistently with bulldozer and probably won't replicate that with zen.

I honestly doubt that 10nm will be it for silicon; the physical limit is 7/6nm, and there is too much money in getting to that limit for them to just sit at 10 while a smaller process is still possible.

We WILL, however, switch over to another material eventually. Graphene is one of the more promising ones: in labs, hitting 50-100GHz is not hard on air. The issues are the stability of the material, and the fact that we would be going backwards to a ~30nm process. Considering I'm on a 45nm chip and it's running just fine, being able to run the chip at even 20GHz on air would EASILY make up for the larger process.

Yeah, I meant on average. Where is my stock 20ghz chip dammit

Don't know that yet, but yeah, I honestly doubt hitting 5GHz will ever be anything but a silicon-lottery thing. That said, 4.5 is all you really need to alleviate any reasonable bottlenecks, and to that end 3.5-4 can do just as well; once you hit 4.5 it's diminishing returns all round.

Kek'd

For performance, wouldn't a higher frequency (on good architecture) be more important? For parallel tasks you can just throw more threads at it. I obviously understand the power/price push currently in the industry, but maybe they'll chase one when the other is at a dead end
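
A quick sketch of the catch with "just throw more threads at it": Amdahl's law caps overall speedup by the serial fraction of the work. The parallel fractions and core counts below are made-up illustrative numbers, not measurements.

```python
# Amdahl's law: speedup from n cores when a fraction p of the runtime parallelizes perfectly.
# The parallel fractions and core counts below are illustrative assumptions.

def amdahl_speedup(p: float, n: int) -> float:
    """Overall speedup with n cores when fraction p of the work is perfectly parallel."""
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.50, 0.90, 0.99):
    for n in (4, 16, 64):
        print(f"parallel fraction {p:.2f}, {n:2d} cores -> {amdahl_speedup(p, n):5.2f}x")

# Even 99% parallel code tops out around ~39x on 64 cores, and a 50% serial share caps
# you near 2x no matter the core count -- which is why single-thread speed still matters.
```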

The problem is that with the end of silicon scaling, we can't go faster. The substrate simply cannot handle it, and these wide and slow architectures limit the clock speed further.

Going narrow and fast again (Netburst, Bulldozer) won't work because heat and power consumption would give us another painful bitchslap.

Different transistor structures, 3D chip construction, and maybe switching to GaAs or something similar are the only real ways forward, and all of them are going to be expensive as hell, which in turn will drive prices up for us.
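
A rough sketch of why "narrow and fast" bites back on power: dynamic CMOS power scales roughly as C·V²·f, and higher clocks typically need higher voltage too. The capacitance, activity factor, voltages, and clocks below are assumed illustrative values, not figures for any real chip.

```python
# Dynamic CMOS power: P ~ alpha * C * V^2 * f. All numbers here are assumed illustrative
# values (effective capacitance, activity factor, voltage per clock target).

def dynamic_power_watts(c_eff_farads: float, v_volts: float, f_hz: float, activity: float = 0.5) -> float:
    return activity * c_eff_farads * v_volts ** 2 * f_hz

C_EFF = 1e-8  # assumed effective switched capacitance, 10 nF

baseline = dynamic_power_watts(C_EFF, 1.0, 3.5e9)   # ~3.5 GHz at an assumed 1.0 V
pushed   = dynamic_power_watts(C_EFF, 1.3, 5.0e9)   # ~5.0 GHz, assuming it needs ~1.3 V

print(f"baseline ~{baseline:.0f} W, pushed ~{pushed:.0f} W "
      f"({pushed / baseline:.1f}x the power for {5.0 / 3.5:.2f}x the clock)")
```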

Chlorine processors when?

Since humans will eventually master time travel, why haven't the humans from the future come back in time and given us future technology?

That's storage you dumbass.

We're talking about processors.

There is a theory for that

You can't actually travel back in time. You can only see a mirage from the past.

Like we do when we look at the stars right now.

Why should the industry try introducing newer fabrication methods? The manufacturers benefitting from this all have zero competition: Intel has a monopoly on x86, Nvidia has a monopoly on graphics accelerators, and AMD barely releases anything that could drive the market forward.

This is also why earlier forecasts showed more favourable node-shrink predictions: a few years back, AMD had more competing products. A few years ago they still had high-end CPUs (even if they were high end only in their own way). Today they have... one new videocard, and a bunch of low-end APUs recycled from 3-year-old tech.

If there was any actual competition, you can bet your ass that Intel would sink billions into further shrinking nodes. They abandoned tick-tock because of this too: there is no gain in R&D when they are a half decade ahead of the competition in their x86 designs.

>implying the singularity won't start on a Pentium 4 running gentoo

>antisingularityfags
You niggas don't even know what the singularity is. Shit, I fell for the bait didn't I?

If FTL travel is possible in humanity's lifetime, will we have powerful enough scopes to see the history of humanity in its entirety?

>techno cultists

>what is that a-lot-more-lucrative-than-PC embedded SoC market?

They will still have plenty of leg room to improve performance and lower power consumption. Just because 10nm is the smallest they can make the chip, doesn't mean they have the optimal architecture figured out yet.

Pretty much this; the only reason lithography has advanced, aside from Intel's developments, is the SoC market.
Beyond the SoC market, lithography is used to produce pretty much every electronic device, be it SSDs or whatever ASIC is out there.

Fuck it, just make a bunch of architecture improvements and high-speed die interconnects until then.

1. We can only travel a certain distance from the inception of time travel

2. We are the first instance in space and time and thus, there is no "future" us to meet.

3. Time travel will never be created

This.

Basically, when the new material is stable enough, everyone will use it.

Much like how carbon fiber and bonded aluminum took 30+ years for the auto industry to actually start using them.

Hi lysefag

If time travel is theoretically possible, I'm sure there will most certainly be some ban on going into the past and changing shit to create outcomes beneficial to yourself. In other words, with the exception of God, time travel is impossible.

>graphene likes to be graphite too much for anything useful.

>mfw my AMD Zen 2 CPU overheated and turned into a pencil

5nm and you get quantum tunneling.
3nm is barely possible with different geometry for the transistor.
We're running into a brick wall with transistor sizes, the industry will stop the shrinking (unless quantum computers take off).

>5nm and you get quantum tunneling.

Insulated channel.

>3nm is barely possible with different geometry for the transistor.
GAA.

>We're running into a brick wall with transistor sizes, the industry will stop the shrinking (unless quantum computers take off).

Optoelectronic devices can scale below conventional transistors. "Quantum computers" still use chips fabbed in silicon; the more qubits you want, the larger the die will be. Guess what that means? Qubit chips will scale exactly like every single other CMOS chip. It isn't a solution to area scaling, you idiot.

Stop regurgitating things you see other tech illiterate retards post here.

...

Doesn't Intel have plans for a 5nm CPU?

>GAA
how much of a benefit will this realistically offer end users?

sure leakage and switching speeds would improve, but it's not like there are massive density gains to be had, nor the speed benefits to be had from increased densities.

I don't get it.

Why do people associate transistor size with performance? The actual size of a CPU's die is small, and in a PC, as well as many other applications, you aren't constrained by space anyway.

The orientation of the structures is entirely different. Instead of having enormous source and drain wells with a small switch in the middle, GAAs can be oriented vertically or horizontally, and require a much smaller-pitch metal contact. They facilitate enormous back-end scaling, and drive voltages are in the mV range. They're the crowning gem of conventional CMOS designs.

>quantum tunneling.
>Insulated channel.
This is the dumbest thing I've ever read

Insulation won't have a noticeable effect on tunneling, go brush up on quantum theory friend. Tunneling can occur through regions of infinite potential, aka perfect insulation

Shorter distances mean faster processes. Making the die bigger can only help so much.

Smaller transistors mean faster switching and lower voltages (which means less heat). Die shrinks increase yields, and are also necessary to keep the quicker clock synchronized across the entire die.
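
A back-of-the-envelope sketch of the yield point, using the simple Poisson defect model Y = exp(-D0·A): a smaller die both fits more times on the wafer and is less likely to catch a defect. The defect density, die areas, and wafer size here are assumed illustrative values.

```python
import math

# Poisson yield model: Y = exp(-D0 * A), with D0 defects per cm^2 and A the die area in cm^2.
# Defect density, die areas, and wafer size below are assumed illustrative values.

D0 = 0.2                                  # assumed defects per cm^2
WAFER_AREA = math.pi * (30.0 / 2) ** 2    # 300 mm wafer area in cm^2, ignoring edge losses

for die_area in (1.6, 0.8):               # e.g. a die before and after a full-node shrink
    die_yield = math.exp(-D0 * die_area)
    gross = WAFER_AREA / die_area          # crude dies-per-wafer, no edge/scribe correction
    print(f"die {die_area:.1f} cm^2: yield {die_yield:.0%}, ~{gross * die_yield:.0f} good dies/wafer")

# Halving the die area roughly doubles candidates per wafer AND raises per-die yield,
# which is a big part of why shrinks have historically cut cost per chip.
```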

>Not having a quantum core in your cpu
>2029

>I'm a retard
Cute

Literally, that's what SiGe channels are used for. The entire design behind a "quantum well FET" is to have a device completely insulated so that errant charge cannot tunnel out of the channel. A normal FinFET structure with a SiGe channel is, as a matter of fact, a QWFET device.
Tech illiterate redditor retards who spout buzzwords are in no position to discuss the nuances of advanced IC design.

That's good. This will force developers to make efficient OSes and applications. You'll no longer need a new CPU every 6 years just because some new OS update needs it.

but isn't the entire premise of this thread that lithography scaling is arguably becoming untenable?

how much do things like smaller pitch metal contacts matter for things like wire delay if the metal layers aren't getting denser?

>I'm sure there will most certainly be some ban
How the fuck do you enforce something like that? Fucking time police?

Logic core
GPU core
Quantum core
Neuron core


What did I miss?

FPU.

Time Enforcement Commission

Allahu Snackbar

Time travel, if invented, is almost surely going to require nation-scale or even global-scale resources and effort to perform.
Which means enforcement would be as simple as just not doing it for anyone.

This isn't really bad news for the tech singularity. Yeah, transistors are now shrinking more slowly, but processors are still getting faster and more energy efficient.

The whole point of the tech singularity is that a computer becomes self aware and starts improving itself to become smarter than a human.

All this requires is understanding of what algorithms make humans conscious and smart. To do that we just have to simulate a human brain and find out those things in simulations.

Modern computing requirements estimates for a human brain simulation are about 1 exaFLOPS. We have 0.1 exaFLOPS of computing power with that recent chinkshit supercomputer.

Anyway, I'd say we're perfectly on track for the tech singularity by 2040-2050.
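
Back-of-the-envelope check on that 2040-2050 claim, using the thread's own numbers (0.1 exaFLOPS today, ~1 exaFLOPS for a brain simulation) and an assumed doubling period for top-supercomputer performance; the doubling times are guesses, not data.

```python
import math

# Years to go from ~0.1 exaFLOPS today to the ~1 exaFLOPS brain-simulation estimate,
# assuming top-supercomputer performance keeps doubling every N years.
# The doubling periods are assumptions, not measured trends.

current_eflops = 0.1
target_eflops = 1.0
doublings_needed = math.log2(target_eflops / current_eflops)   # ~3.3 doublings

for doubling_years in (1.5, 2.0, 3.0):
    years = doublings_needed * doubling_years
    print(f"doubling every {doubling_years} yr -> ~{years:.0f} years to reach 1 exaFLOPS")

# Under these assumptions the raw FLOPS arrive well before 2040; whether the algorithms
# for a useful brain simulation exist by then is the harder question.
```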

Okay then how do we unify and program all this?

Imma take this opportunity to ask a question that doesn't deserve its own thread: I saw somewhere that the transistors themselves are not 14/16/xx nm, but instead that that's the size of the die containing the transistors. Could anyone clarify or post some source where I can read about this? (I don't know how to word it for Google.) Thanks

How long before rich people send themselves into the future with fast spacecraft?

>ctrl-f "gallium"
>no results
InGaAs is an alternative. Intel says 10nm is the end of the road for silicon alone, but gallium arsenide buys some time.

Ch-Chlorine processors, user?

now
t. Elon

No. It measures the half pitch of two memory cells.

kek give me a fucking break, singularityfags

as soon as chlorine becomes a III-V semiconductor

...

I definitely think we will have AGI by 2050. I believe the whole AlphaGo thing from Google DeepMind is already promising, considering its accomplishments.

You might want to brush up on quantum theory. Tunneling probability through a potential barrier decreases with the height of the potential
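
A minimal sketch of that point, using the rectangular-barrier WKB estimate T ≈ exp(-2κd) with κ = sqrt(2m(V−E))/ħ; the barrier heights, width, and electron energy below are illustrative assumptions.

```python
import math

# Rectangular-barrier WKB estimate: T ~ exp(-2 * kappa * d), kappa = sqrt(2*m*(V - E)) / hbar.
# Barrier heights, barrier width, and electron energy below are illustrative assumptions.

HBAR = 1.054571817e-34   # J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # J per eV

def transmission(barrier_ev: float, energy_ev: float, width_nm: float) -> float:
    kappa = math.sqrt(2.0 * M_E * (barrier_ev - energy_ev) * EV) / HBAR
    return math.exp(-2.0 * kappa * width_nm * 1e-9)

for barrier in (1.0, 2.0, 4.0):   # barrier height in eV
    t = transmission(barrier, energy_ev=0.5, width_nm=1.0)   # 1 nm barrier, 0.5 eV electron
    print(f"{barrier:.1f} eV barrier: T ~ {t:.1e}")

# Raising the barrier (better insulation) suppresses tunneling exponentially, but for any
# finite barrier at ~1 nm widths the leakage never goes to zero.
```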

Holy shit it's been a while since I've seen a troll science post.

I've been privy to some work on building transistors out of RNA; like most biological structures, they don't have good heat resistance though.

SAVE US, IBM!

They will. But we haven't hit the point they travel back to yet.

What does this say about semiconductor testers?
I'm hoping it's business as usual for us.

What if they already did?

whatever happened to metamaterials?
also, the talk about GHz made me remember terahertz EM waves... I wonder if we will ever control and use fast photonic circuits

Might happen, but I don't think we would be able to go back in time.

Jews are still alive. They didn't.

>All this requires is understanding of what algorithms make humans conscious and smart. To do that we just have to simulate a human brain and find out those things in simulations.

You say that like it's no big deal. How would you even find algorithms for human sentience, and relate them to it, let alone understand them? How would you simulate a human brain? How would you turn that into human-readable "algorithms"? How, if you found these algorithms and they even exist in quantifiable form, would you artificially recreate the plasticity of the brain, i.e. create self-improving software? Plus a whole bunch of other shit. No one alive today will be around to see any of that. It's not like we suddenly make a computer powerful enough that it's like "oh hey it's fucking sentient, time for singularity". There's a lot more that has to go into it beyond processing power.

>GAA
That's exactly what I'm referring to.
>Optoelectronic devices can scale below conventional transistors. "Quantum computers" still use chips fabbed in silicon; the more qubits you want, the larger the die will be. Guess what that means? Qubit chips will scale exactly like every single other CMOS chip. It isn't a solution to area scaling, you idiot.
Are you fucking autistic? I'm not saying quantum computers will solve the area scaling problem. Holy fuck how dense can you be? I'm saying quantum computing will improve our processing capabilities by using a fundamentally different computing mechanism.
Fucking retard aspie thinks I'm referring to area scaling lmao.

The field is called artificial intelligence and we've already made significant progress.