We only have 6 years to find a replacement for silicon

hpcwire.com/2016/07/28/transistors-wont-shrink-beyond-2021-says-final-itrs-report/

>Transistors Won’t Shrink Beyond 2021, Says Final ITRS Report

Is this it for PC?

Graphite??

You mean graphene.

Literally quantum computers

Good thing we have Sup Forums to bring together all the greatest minds in technology to search for a solution, then. I'm not worried.

I'll get to work on a logo.

I'll write the man pages

Actually, I don't feel like it

...

>Every computer of the same size will be relatively the same price, computers will stop getting better and thus all hardware will be on even ground; the cheapest netbooks will run the newest games, and people will no longer need to upgrade their PC.
Why aren't you embracing the computational power equilibrium?

Semen

Doesn't matter, we rarely actually need more than what we had in the 80s.

In 2021 you'll play games at 4K ultra settings on your smartphone via the cloud, anon

Carbon.

Next stage is biocomputers

silicone

What about DNA computing? Using DNA as a CPU is supposedly 10 times faster than the CPUs we have now.

youtube.com/watch?v=yHXx3orN35Y

God forbid people have to start optimizing again. It's not like 80% of consumer software these days isn't deliberately gummed-up, bloated shit that exists only to shill upgrades.

Posted from my Core 2 Duo
Dictated but not read

Good. Maybe then these shitty devs will start bothering optimizing their shit.

I want to play it with keyboard and mouse on my own hardware. Fuck you and your cloud.

^

You don't use DNA as a CPU, as it's rather inert. It is suitable as a storage medium.
For processing, more suitable choices would be ribozymes (catalytic RNA) or proteins.

youtube.com/watch?v=Hrm-rPSCIBw

Gallium (in compounds like gallium arsenide) actually performs better than silicon. We use silicon because it's much easier to refine than gallium. If we could perfect the process of making gallium-based CPUs we would see a significant speed boost, but we'd almost immediately run into the same problems we're having with silicon.

Just sharing experience; these things are my profession, actually.
But it's a long way to biological computers, as biologists are focused on other things.
The next big step is a DNA programming language, where you write regular code that compiles into meaningful DNA sequences along with all the required regulatory elements; there's a toy sketch of the idea below.
That way, in theory, any ten-year-old could print out a functional bacterium or virus.
Some crazy shit is going on in molecular biology nowadays.
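For the curious, here's a toy sketch of that idea in C. Everything in it is made up for illustration (the one-codon-per-amino-acid table, the hypothetical "MKVL*" peptide); a real tool would also have to emit promoters, ribosome binding sites, and the other regulatory elements mentioned above:

/* Toy "protein to DNA" compiler sketch. Purely illustrative:
 * one arbitrary codon per amino acid (many synonymous codons exist). */
#include <stdio.h>
#include <string.h>

static const char *codon_for(char aa) {
    switch (aa) {
    case 'M': return "ATG";  /* methionine / start codon */
    case 'K': return "AAA";  /* lysine                   */
    case 'V': return "GTT";  /* valine                   */
    case 'L': return "CTG";  /* leucine                  */
    case '*': return "TAA";  /* stop codon               */
    default:  return NULL;   /* not in this toy table    */
    }
}

int main(void) {
    const char *protein = "MKVL*";   /* hypothetical peptide */
    printf("5'-");
    for (size_t i = 0; i < strlen(protein); i++) {
        const char *c = codon_for(protein[i]);
        if (!c) { fprintf(stderr, "unknown residue %c\n", protein[i]); return 1; }
        printf("%s", c);
    }
    printf("-3'\n");   /* prints 5'-ATGAAAGTTCTGTAA-3' */
    return 0;
}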

Don't worry. Lazy coders are wasting 90% of current PCs' potential. Once hardware stops improving it'll force them to get off their asses and do some optimizing. Expect another 10-15 years of improvement after hardware stops improving.

I think the solution is much simpler than you think. The material we need to replace silicon is something that conducts electricity so well that no electrical energy is wasted as heat; something like graphene or a superconductor would do. Create stackable die chips to add more transistors to a single processor, and instead of a square CPU make something rectangular. AMD needs to switch to some form of LGA soon or they won't be able to continue much further.

I don't get it. So an old computer can make shitty graphics? ok...

The Feds have really been watching this stuff for the past few years since it looks like any moron will be able to make a bioweapon in the near future. 9/11 already killed home chemistry.

Germanium

Oh, for a minute there, I thought you were...
>Implying
That the entire world is running out of silicon ore.
Thank God that hasn't happened yet, at least.

That computer is weaker than a SIM card, you fuck.

What if instead of making transistors smaller and smaller to fit more on a wafer, we started to make the wafers slightly bigger?

woah
we got a genius here

Caspere knew this

No. Quantum transistors and computing have seen huge advances lately, with longer stability and larger and larger QPUs already BTFOing supercomputer farms with a single QPU.
Once it becomes viable in the next 5-10 years to make small, home-computer-sized versions, we'll slowly see them replacing traditional CPUs.

No.

I'm not saying quantum compute units will be in every home in 10 years. I'm saying the rich will start getting hold of crude ones at home, and it will be like the 70s, when computers first became accessible to rich hobbyists, but with quantum compute units instead.
In every home is likely 20 years away, and will happen out of necessity rather than convenience.

Also graphene and metallic hydrogen

>Gallium (in compounds like gallium arsenide) actually performs better than silicon. We use silicon because it's much easier to refine than gallium. If we could perfect the process of making gallium-based CPUs we would see a significant speed boost, but we'd almost immediately run into the same problems we're having with silicon.
Can the same gallium nitride process used in these MOSFETs be used? TSMC is making these already.
gansystems.com/

Switching losses are around 10x lower than silicon MOSFETs'. I'm not sure how that would translate into processor performance; I assume heat would be greatly reduced and you could feed them enough voltage to run at 6 GHz or 8 GHz (rough back-of-the-envelope power math below). But as you say it would be a one-off boost, unless GaN forced them back to 180nm or something and it took them a while to increase density again.
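Rough numbers, all assumed rather than measured (the activity factor, switched capacitance, and voltage are generic CPU-ish guesses), just to show how dynamic power P = a*C*V^2*f scales with clock and what a claimed ~10x cut in switching losses would buy:

/* Back-of-the-envelope sketch: every constant below is an assumption
 * for illustration, not data from GaN Systems or TSMC. */
#include <stdio.h>

int main(void) {
    double activity = 0.1;    /* assumed fraction of gates switching per cycle */
    double cap      = 2e-7;   /* assumed total switched capacitance, farads    */
    double volts    = 1.2;    /* assumed core voltage                          */
    double freqs[]  = { 4e9, 6e9, 8e9 };

    for (int i = 0; i < 3; i++) {
        double p_si  = activity * cap * volts * volts * freqs[i];
        double p_gan = p_si / 10.0;   /* the claimed ~10x lower switching loss */
        printf("%.0f GHz: Si-like %.1f W, GaN-like %.1f W\n",
               freqs[i] / 1e9, p_si, p_gan);
    }
    return 0;
}

If the 10x figure held, the same power budget that feeds the ~115 W silicon-like part at 4 GHz would cover the GaN-like column at any of these clocks.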

quantum computing is absolutely fucking terrible for anything outside of highly specific scientific uses

I knew 10nm wouldn't be worth the wait. They're going to spend forever iterating on it with small performance increases.

Why the fuck else would you use a computer? Are you a Sup Forumsag?

lol this
Look at iPhone vs Android: the thing has 1/4 the RAM and it's still faster.

I really hope hardware stops improving, so devs can finally debug their software to run better instead of relying on hardware.

This. Hopefully devs will actually optimize now.

Does this mean consoles will finally catch up with PCs reasonably well? Or will gaming laptops become powerful enough to replace consoles?

lol

Just because transistor size has stopped shrinking doesn't mean that CPU/GPU optimization with each new generation will stop.
New architectures will be better at more and more tasks, and other things such as branch prediction can be improved (a quick demo of how much that matters is sketched below). Just because we've hit a size limit doesn't mean everything is going to perform at the same level.
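A quick, generic demo of the branch-prediction point (nothing from the linked articles; compile with a low optimization level like -O1, since at -O3 the compiler may turn the branch into branchless code and hide the effect):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)   /* 16 MiB of random bytes */

/* the branch on a[i] is what the CPU's predictor has to learn */
static long long sum_over_threshold(const unsigned char *a) {
    long long sum = 0;
    for (int i = 0; i < N; i++)
        if (a[i] >= 128)
            sum += a[i];
    return sum;
}

static int cmp(const void *x, const void *y) {
    return *(const unsigned char *)x - *(const unsigned char *)y;
}

int main(void) {
    unsigned char *a = malloc(N);
    if (!a) return 1;
    for (int i = 0; i < N; i++) a[i] = rand() & 0xFF;

    clock_t t0 = clock();
    long long s1 = sum_over_threshold(a);   /* random order: ~50% mispredicts  */
    clock_t t1 = clock();

    qsort(a, N, 1, cmp);                    /* sorted: branch becomes predictable */
    clock_t t2 = clock();
    long long s2 = sum_over_threshold(a);
    clock_t t3 = clock();

    printf("unsorted: %lld in %.3f s\n", s1, (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("sorted:   %lld in %.3f s\n", s2, (double)(t3 - t2) / CLOCKS_PER_SEC);
    free(a);
    return 0;
}

Same data, same loop; the sorted run is typically severalfold faster because the predictor stops guessing wrong half the time.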

t. JavaScript programmer

The next paradigm will be 3D stacked carbon nanotube chips, which will be 10000x better than silicon. We don't know when these will come out, but there's been a lot of progress on CNT logic gates.

spectrum.ieee.org/semiconductors/devices/how-well-put-a-carbon-nanotube-computer-in-your-hand

popularmechanics.com/technology/a18493/stanford-3d-computer-chip-improves-performance/

wired.com/2015/10/ibm-gives-moores-law-new-hope-carbon-nanotube-transistors/

Transistors not shrinking is actually worst for mobile devices, not PCs. Chips just get bigger and require more power as the transistor count increases.

There's a point at which shit can't possibly shrink and be useful.

And I quite honestly don't have a problem with that.

>Cars won't shrink beyond 1998
Is this it for trucks?

How long until wheel sizes stop shrinking and all existing tech explodes because of Moore's law?

Graphite bre, we'll use pencils. Fuk those computers

Lmao!!

People won't break into your house and steal your USB/Bluetooth/WiFi/LTE/LaserIO keyboard and mouse, for fuck's sake

What do I do if playing games on my own hardware is no longer an option, i.e. Steam disappears?

underrated post

*sigh*

Can't wait for the Cloud Dystopia.

SAVE US FROM THE BURDEN OF TOOLS GOOGLE AND MICROSOFT, WE DON'T NEED NO MEANS OF PRODUCTION!

>gallium
>melts at body temperature
Whoops, didn't attach my 2 meter tall CPU heat sink properly, time to get a new CPU.

This.

Does that mean I'd have to feed my computer?
More importantly does this mean I could rape my computer by ejaculating into it?

>no LN setup
>being poor

I'll sic the CJWs on you! Don't you dare!

Cumlord Justice Wankers can't stop me. I'm untouchable

>not jizz
Seriously?

Kek no, if we run out of advancement in hardware, developers might actually start to code properly.

There's no need to be rude.

B A S E D S A A S

wut

Shit analogy, because car wheel sizes have been going up.
You can buy a Civic now with 19" wheels.

Every die shrink from here on out will have progressively smaller gains. It's entirely possible this date gets pushed back because nobody wants to spend the money to upgrade both consumer hardware and fabrication plants.

>Sup Forums realised the obvious from 2009 *TODAY*
wtf I thought you knew

If we invest in a decent replacement for slow hipster languages like JavaScript, we'll be fine!

>computer gets hot
>the cpu melts
what a great replacement

Stalinium conductors

Nanowire transistors.
FinFET gates wrap three sides of the channel; nanowire (gate-all-around) gates wrap all four.

3 GB of RAM and faster than an iPhone here

This. Remember what 1 MHz and 64K used to do?

I'm pretty sure you can't even make a text editor that runs on such specs, let alone a game.

oh boy, maybe now we can finally step up to technology developed in the 1960s rather than the 1950s

youtube.com/watch?v=Ts96J7HhO28

Italian kek

People did with less, and frankly 1 MHz is surprisingly fast.
If my math is correct:
Instructions need about 6 cycles to execute (not pipelined, variable-length instruction case).
1 MHz is 1 million pulses per second.
A cycle is around half of that, because a full cycle needs 2 pulses.
That gives half a million cycles: 500,000 / 6 ≈ 83,333 instructions/second, not counting RAM and instruction-specific time delays.
1x10^9 nanoseconds (1 second) / 83,333 instructions ≈ 12,000 nanoseconds per instruction.
Assume each instruction accesses RAM about 3 times (fetch, fetch operands, write to RAM) and each access takes about 1,000 nanoseconds: 3,000 ns + 12,000 ns/instruction = 15,000 ns/instruction, and 1x10^9 ns / 15,000 ns/instruction ≈ 66,666 instructions/second. There are other factors that could add delays, but the number shouldn't drop below 25,000 instructions/second.
A text editor is mostly interrupt handling (in those days, at least), so it's a rather bad example. But the main loop that copies the data is about 6 instructions:
move characterFrom to characterTo
add characterFromAddress, 1
add characterToAddress, 1
add registerThatKeepsTrackOfSize, 1
compare sizeRegister to size
if not bigger, jump to start
At 66,666 instructions per second, 6 instructions take 6 / 66,666 ≈ 0.00009 seconds, i.e. 0.09 milliseconds.
For reference, 276 milliseconds is the average human reaction time.

To add: the loop is a copy-memory routine. It's the core of the program, and I assume a text editor would use text copying quite a lot; a C rendering with the same timing estimate is below.
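Here's the same loop rendered in C. The 66,666 instructions/second figure is the back-of-the-envelope estimate derived above, not a measurement of any real 1 MHz machine:

#include <stdio.h>

/* one byte per iteration, matching the move / add / add / add /
 * compare / jump sequence in the post above */
static void copy_bytes(char *dst, const char *src, unsigned n) {
    unsigned i = 0;
    while (i < n) {        /* compare sizeRegister to size + conditional jump */
        dst[i] = src[i];   /* move characterFrom to characterTo               */
        i++;               /* stands in for the three "add ..., 1" bumps      */
    }
}

int main(void) {
    const double ins_per_sec = 66666.0;  /* the assumed estimate from above */
    const double loop_ins    = 6.0;      /* instructions per loop pass      */

    char src[] = "hello";
    char dst[sizeof src];
    copy_bytes(dst, src, sizeof src);
    printf("copied: %s\n", dst);
    printf("one loop pass: ~%.2f ms\n", 1000.0 * loop_ins / ins_per_sec);  /* ~0.09 ms */
    return 0;
}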

Why don't we just move over to quantum computers once we start running into quantum side effects?

>Also graphene and metallic hydrogen
Ahahahahahahahaha.
No. Especially not the latter.

>video games
>photo editing
>video editing
>office suites
>web browsing
>voip services

The list goes on and on. You do realize most computers in the world are used for normal tasks, right?

More like we have to find a replacement right now, because it's going to take a long time to develop a new material into a working chip.

We won't be replacing silicon with another semiconductor, however, since at 7nm chip features are only 10 atoms across. I think it's time to start looking into something other than electricity to do our computing.

I, for one, welcome our old book overlords.

>Does that mean I'd have to feed my computer?

You already have to feed it electrical power.

In the future, you'll also be able to use leftover pizzas. Your PC will double as a trash compactor / recycling machine.

Will it be connected to the plumbing, too, or do I have to clean up after it/take it out?

No, you have to change your computer's diaper

>2021
>Not owning every color of Lamy Safari
Graphite is for schoolchildren, math students and the sexually confused