<1nm

What are we going to do to continue advancing after 2022, when CPUs reach the atomic scale?

Is there any solution available?

Also, does anyone have any charts that display technological advancement in the 2010s? It seems like transistor count/GFLOPS per dollar charts all stop at around 2011; is this because there has been no advancement in the last 5 years?

Other urls found in this thread:

dwavesys.com/press-releases/d-wave-systems-announces-multi-year-agreement-provide-its-technology-google-nasa-and
youtube.com/watch?v=ue4z9lB5ZHg
arstechnica.com/gadgets/2015/07/ibm-unveils-industrys-first-7nm-chip-moving-beyond-silicon/

Intel will release their stashed photonic stuff and bankrupt ayyymd.

Here's a chart displaying the 'Solar Miracle' that has been happening over the last 5 years.

You can really tell this was made when Intel just pushed the P4 out like there's no tomorrow, trying to squeeze out moar gigahurtz.


ILP is a FLAT LINE. OH BOY were they retarded.

Here's china's cumulative solar capacity to show the exponential trend.

The last 5 years of solar growth are astounding and completely blew out even the most optimistic predictions.

One more: this is actual capacity vs the 2000, 2002, 2005 and 2007 predictions for solar, hence why it's known as a 'miracle'.

...

This is $ per GFLOP graphed on both a linear and a logarithmic scale.
Over 50 years it's gone from $1.1 trillion per GFLOP of processing power to about $0.04 in 2016.
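
If anyone wants to sanity-check that trend, here's a rough back-of-the-envelope in Python. It just takes the figures in this post at face value and assumes the '50 years' runs 1966-2016:

import math

start_cost = 1.1e12   # dollars per GFLOP at the start (assumed ~1966)
end_cost = 0.04       # dollars per GFLOP in 2016
years = 50

ratio = end_cost / start_cost
annual_factor = ratio ** (1 / years)                   # cost multiplier per year
halving_time = math.log(2) / -math.log(annual_factor)

print(f"cost drops to {annual_factor:.1%} of the previous year's value each year")
print(f"so the price of a GFLOP halves roughly every {halving_time:.2f} years")

Works out to the price halving roughly every 1.1 years, which is about what you'd expect with Moore's law plus falling chip prices both in play.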

This is the TOP500 supercomputer list on a logarithmic scale, displaying the Sum of all 500, the #1 Supercomputer, and the #500 Supercomputer.

Also included are points for the upcoming 'Aurora' and 'Summit' supercomputers, paid for by the Obama administration and currently in progress.

This one's only semi-technology-related, but it displays the number of buildings over 200 meters tall produced worldwide per year, and the cumulative total.

Between 2010 and 2016 we built as many skyscrapers as were produced between 1960 and 2010.

...

Pic related, the Shanghai Tower, completed in July this year and now the 2nd tallest building in the world.

This chart from the Nvidia Computex Keynote displays the explosion of growth for mobile computing devices since 2010.

This chart shows the falling cost per GB of storage, including a trend line showing what could have been, had it not been for the 2011 Thailand floods.

Some solutions:

- alternative materials: GaAs (doable) and InP (very hard)
- alternative materials 2: switch to superconductors (quite hard but was done nearly 30 years ago now in Japan)
- alternative materials 2b: based on superconductors use RSFQ logic (theory is about 40 years old, has been tested, not trivial)
- stacking: through-silicon vias, stacking wafers to increase density
- stacking 2: place transistors above wiring layers and then new wiring (very hard)
- architecture change 1: drop x86, introduce flag days to drop misfeatures in x64 and Itanium (technically not hard, way overdue)
- architecture change 2: use dual rail logic
- architecture change 3: use self clocking logic
- architecture change 4: combine the two above (papers on this were published in the mid-90s; toy sketch after this list)
- architecture change 5: combine the above (arch 4) with RSFQ logic (100 GHz is expected to be possible)

As you can see, in /sci/ence many ways forward are known; we just wonder when en/g/ineering will use them.
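
To make the dual-rail + self-clocking combination concrete, here's a toy Python model of it: each logical signal rides on two rails, and a gate only produces output once both inputs have arrived, so no global clock is needed. Purely illustrative, not how you'd actually build it:

NULL = (0, 0)                # spacer: no valid data on either rail yet

def encode(bit):
    # dual-rail encoding: logical 1 -> (1, 0), logical 0 -> (0, 1)
    return (1, 0) if bit else (0, 1)

def is_valid(sig):
    # completion detection: exactly one rail is high
    return sig in ((1, 0), (0, 1))

def and_gate(a, b):
    # true rail fires only when both true rails fire;
    # false rail fires when either false rail fires
    if not (is_valid(a) and is_valid(b)):
        return NULL          # data not complete yet, downstream just waits
    return (a[0] & b[0], a[1] | b[1])

print(and_gate(encode(1), encode(0)))  # (0, 1): logical 0
print(and_gate(encode(1), NULL))       # (0, 0): still waiting, no clock involved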

Looks like it's falling.

14nm is just marketing.
The width of the fins is 7nm and the distance between fins is something huge like 60nm.
There is still a lot of room for improvement.

It's called optoelectronics.
Photonic logic gates have lower energy per op than any transistor can manage, because exciting a photon requires orders of magnitude less energy than influencing electrical resistance in a channel. It's the only thing that covers both an increase in area scaling and facilitates radically increasing density in the Z plane. Not many devices other than vertical GAAs have the same characteristics, and wires have a finite size.

>physics laws
>subject to change

There already is a quantum computer commercially available from D-Wave Systems, and the NSA probably has one as well.

>he fell for the dwave meme

Life must be fun being so naive.

This chart shows the peak GFLOPS performance of high-end GPUs; for reference, the newly released 1080 would be around 11,000 on this scale.

>D-wave doesn't work
Is that why both NASA and Google are buying from them? So you think those organisations don't do due diligence?
dwavesys.com/press-releases/d-wave-systems-announces-multi-year-agreement-provide-its-technology-google-nasa-and

D-Wave shill, please. You've got better places to spend your work hours on; tell your bosses.

Quantum computing for now is nothing more than popsci garbage.

What does /g/ think of the 'Singularity'?

While I agree with the reasoning that technology has been advancing unbelievably fast, and that the differences between now and 20 years ago are akin to the difference that would previously have separated hundreds of years, it seems a little wishful, doesn't it?

I think the more realistic idea is that we'll look back on this 40-year period of 1980-2020 and see it as a technology boom that can't be reproduced; the alternative is a little too ridiculous to be taken seriously.

what are you talking about? We can't make transistors smaller than 5nm because of quantum tunneling.

This user kinda listed all the viable ones. Thing is, for most of those optics-based transistors you need really cold temperatures, hence SC and RSFQ.

Make CPUs physically bigger and keep a bigger process.

There's a single-atom transistor for "quantum computing"
youtube.com/watch?v=ue4z9lB5ZHg

Concurrency is one way to solve it. You could also squeeze a small performance increase by improving the architecture and optimizing software layers.
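
For what it's worth, here's a throwaway Python sketch of what that looks like in practice: the same CPU-bound work, once run serially and once spread across processes. The workload and chunk sizes are made up purely for illustration:

import math
from concurrent.futures import ProcessPoolExecutor

def busy_work(n):
    # deliberately CPU-bound toy task
    return sum(math.sqrt(i) for i in range(n))

if __name__ == "__main__":
    chunks = [2_000_000] * 8

    serial = [busy_work(n) for n in chunks]          # one core does everything

    with ProcessPoolExecutor() as pool:              # one worker per core by default
        parallel = list(pool.map(busy_work, chunks))

    assert serial == parallel    # same answers, the work was just spread out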

MOAR CORES will probably be the trend instead of relying on competent programmers.

Not a graph, but an example of real time rendering in video games improving between 1993 and 2016.

You can plot any two things together on a graph and show a nonexistent trend. The abstraction of "events" in this case is a nonsensical quantity; it's subjective. Kurzweil does this a lot to show exponential growth where none exists, because he needs to force everything to fit a narrative since he's profiting from it.
Serial integer performance has been suffering the effects of diminishing returns. It's not an issue that pursuing higher clocks can ever solve either. We can't feed instructions to execution logic fast enough, or substantially faster than we already do. It's the reason why Intel will only muster a 3-10% IPC uplift per generation while targeting improvements in perf/watt.

> We can't make transistors smaller than 5nm because of quantum tunneling.

Stop regurgitating terms you know literally nothing about. Stop it.

This graph shows yearly Research and Development spending by a selection of countries as a percentage of GDP.

This is an interesting one: it displays GDP per capita from the year 1500 to 2000.

This is the UK life expectancy over time, from 1930 until 2013.

Related: a prediction of centenarian rates out to the year 2100.

A first-worlder born in 1995 has a 27% chance of living to 100, excluding possible medical advancements in the future.

This graph predicts the worldwide Exabytes produced per month.

An Exabyte is 1000 Petabytes, and a Petabyte is 1000 Terabytes.

In 1986, worldwide total storage capacity was 2.6 Exabytes.

This graph estimates total worldwide data, cumulative.

In 2013 it was 4 Zettabytes, in 2016 it's 13 Zettabytes, and by 2020 IDC predicts it will have expanded to approximately 45 Zettabytes.

1 Zettabyte is 1000 Exabytes.
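
For scale, those figures work out to roughly 40-50% growth per year. A quick Python check, taking the numbers in the post (and the IDC forecast) at face value:

def cagr(start, end, years):
    # compound annual growth rate
    return (end / start) ** (1 / years) - 1

print(f"2013 -> 2016: {cagr(4, 13, 3):.0%} per year")   # ~48%
print(f"2013 -> 2020: {cagr(4, 45, 7):.0%} per year")   # ~41%, per the IDC prediction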

This graph shows worldwide data demands, compared with actual storage production.

As you can see, the world is rapidly running out of storage and will soon be unable to store all of the data it produces.

Well, it's sort of "emulating" a quantum computer as a proof of concept. But it still uses single transistors like any normal PC, unfortunately.

Serious question: how do quantum computers help us continue to progress? For instance, let's say a 'quantum computer' is released in 2022 equivalent to the best 2nm architecture available at that time; where would we go from there with the quantum computer that we couldn't go before?

>trend if no crisis
Those floods were bullshit.

The entire world storage market has been permanently crippled by those floods.

Isn't this just because we forget things at an exponential rate relative to how long ago they happened? I'll bet the move from vacuum tubes to silicon is on that chart but the move from mud huts to mud-and-grass huts isn't.

This chart is similar in concept, but better labelled.

This chart is more presently focused, showing lowering amounts of time for 'paradigm shift technologies' to become ubiquitous.

QC can't run crysis. That's all you need to know.

Damn, didn't know Canonical did all these things.
Really makes you think.

No, what they did cripple is your head, by getting you to believe this wasn't a huge scam to raise prices.

The entire concept of a Singularity still seems ridiculous, however. Imagine it as it would actually have to be in the real world: if some new 'paradigm shift' were introduced literally every day, the population physically wouldn't keep up. We can't all rush out to the stores every 24 hours to blow $1000 on the newest brain chip that's 10,000x better than yesterday's model.

I personally know someone who has knowledge of their photonics stuff. Whenever they do start using it in their chips they will blow anything else out of the water.

In that case, will you tell me about 'photonics'? Same question as with quantum computing: how will they allow us to advance in 2022? Can we expect crazy gains?

The points are still arbitrary though; you could make the graph say anything if you picked the right events. Like how the fuck is HDTV a "paradigm shift"? It's just regular TV but slightly better.
oh yeah? well my dad works at nintendo.

I agree with you, but I must say that even after removing some of the stranger points from the graph, it still makes a solid point: 'large technology changes' are happening more and more frequently as time goes on.

To use a common example, bring a man from 1990 into today, and he'll be amazed at our smart phones, super computers and the internet.

Bring a man from 1800 to 1945, and he'll be amazed at flight, cars, and steam power.

Bring a man from 1500 to 1700, and he might not be so surprised.

500 to 1500, 500 B.C to 500 A.D

100,000 B.C to 2000 B.C

The scale of 'change' as a concept has shrunk from millennia to centuries to decades to years.

Sounds pretty dank to me
Intel probably has that shit already and we just have to wait a million years for AMD to catch up

The floods are just part of the truth. Demand also went down significantly thanks to SSDs.

They won't actually be able to go down to 1nm; before that they'll have problems with quantum tunneling. We'll see 10nm chips eventually, and probably even 7nm, but anything below that would be pretty much impossible to use, because electrons would just randomly tunnel through into other lanes and fuck everything up thanks to quantum mechanics.

After 7nm, they'll need to start thinking about new architectures, different materials, and more cores to get any improvements in CPU power. I predict that Moore's Law will completely collapse upon the release of 7nm CPUs; progress will essentially cease until loads of money are sunk into R&D, and it will take years to actually bring any product to market. If AMD manages to stay in business until then, they'll probably end up catching up to Intel. There will probably be a 7nm equivalent to the 2500k which lasts a fucking decade or more due to stagnation.

GloFo is skipping 10nm and going straight to 7nm. So 7nm might be sooner than we think.

It'd be interesting to see how processing speed stagnation will affect software devs.

They'll finally be forced to optimise their shitty code again.

Also, the mobile industry still has a lot of catching up to do, so it'll probably boom greatly.

The right still looks shit.

I'll never understand posts like this.

You think the high-poly 3D render on the right, with dynamic lighting, ambient occlusion, tessellation and anti-aliasing, when compared with a 2D image rendered in the '90s comprising about 800x600 pixels, "looks like shit"?

Specs don't mean shit in real life. It's like how the iPhone still BTFOs every Android device with its "inferior hardware".

Well, most computing these days involves networking, and they found that the single slowest link in the process was the conversion of optical signals into electronic signals at the network edge. So they decided: why not just make a computer out of photonic logic gates, so the entire network + computer is based purely on optical signals? The only real advantage you'll see is that you can play Counter-Strike online at 0.001ms latency, but that's about it.


I'll try to summarize the three types of computing by explaining how a basic transistor would work in each:

>Electronic:
The control bit can activate or deactivate the gate: its electrostatic field either depletes or replenishes the channel between source and drain with carrier electrons.

>Photonic:
The source beam and the control beam interact in a non-linear optic medium. Depending on the presence or absence of the control beam, a signal beam may be generated at a higher harmonic and then down-converted so it can be fed into the next gate as either a control or source beam.

>Quantum:
If we have two qubits then we need to find an interaction between them such that the state of qubit 1 will influence the state of qubit 2 as it evolves in time. So our gate needs to put qubit 2 into a state where it will evolve at a different rate depending on the state of qubit 1. Then we catch qubit 2 after it has evolved for a certain amount of time, and it will have a particular correlation with qubit 1. For example, in research so far, a typical gate has been achieved with either the exchange interaction or the dipole-dipole interaction, and manipulating the spin is done with radio-frequency pulses.
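
That 'evolves at a different rate depending on qubit 1' step is basically a controlled-phase gate, if anyone wants to see the math. A tiny numpy sketch, just linear algebra on state vectors and nothing to do with any physical device:

import numpy as np

phi = np.pi                                     # phase picked up after the chosen evolution time
CPHASE = np.diag([1, 1, 1, np.exp(1j * phi)])   # acts on the basis |00>, |01>, |10>, |11>

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)               # qubit 2 starts in a superposition

for control, label in ((ket0, "|0>"), (ket1, "|1>")):
    state = np.kron(control, plus)              # joint state: qubit 1 (control) with qubit 2
    print(f"control {label}:", np.round(CPHASE @ state, 3))

# With the control in |0>, qubit 2 comes out unchanged; with the control in |1>,
# its relative phase flips -- the correlation between the two qubits described above.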

quantum computing seems very interesting then, thanks for taking the time to type that up for us user

...

The R&D going into alternatives to silicone is growing; by the time 7nm is commercially available I think we will have a silicone alternative.
I doubt that Moore's law will collapse; giants like Intel are already pouring money into R&D for alternatives.

That curve for India is suspiciously smooth. The double plateau for Japan is also strange.

Don't reply to bait

>can play counterstrike online at 0.001ms latency
Nah, the speed of light isn't going to get any faster.

It's not 800x600, it's 320x200.

>the population physically wouldn't keep up
To some extent we have already reached the point where large parts of the population cannot keep up.

The most noticeable segment is the one with the politicians and the chattering classes. You know, those who want to control the people and the future.

Again, stop regurgitating terms you know literally nothing about.
There is no issue with electron tunneling in an insulated device. 5nm and 3nm GAAs are not an issue.

You're a popsci meme spewing redditor.

>typing silicone
>twice

GloFo, Samsung and IBM collaboratively made a working 7nm chip 5 years ago, which back then they estimated would be commercially viable in 2018. It uses a germanium silica alloy.

>Intel probably has that
Unlikely. Recent history (like the x64) shows they have fossilised in their niche: ancient architecture propped up with extreme semiconductor technology.

I never saw a single publication from Intel on superconductivity. Hypres already makes chips; Intel is not moving out of semiconductors.

It's called SiGe, silicon-germanium, and it's widely used in everything. Every 28nm mobile chip on the market has SiGe insulation in it.

IBM's 7nm process uses a SiGe channel which is one of the only notable uses of it. They announced production of functioning chips on the process only last year.

>/g/ really is this clueless

People are unwilling to buy a new PC every 3 years so I'm sure that won't happen.

Yeah 2020

>Soon.

65 nm – 2006
45 nm – 2008
32 nm – 2010
22 nm – 2012
14 nm – 2014
10 nm – 2017
7 nm – ~2019

The problem is that starting with 2006 there was a gradual decrease in the usefulness of going smaller; the gains just aren't there anymore, and it costs a shitload...

My mistake. Slip of the finger on mobile.

arstechnica.com/gadgets/2015/07/ibm-unveils-industrys-first-7nm-chip-moving-beyond-silicon/

>knowing the abbreviation makes me smart.

Well this doesn't actually tell the whole story.

A lot of shit is made in 22nm even today.

>make an ass of yourself
>try to behave like a smartass when called out
>link an ars article in true pleb fashion

Pottery in motion

This. One example: companies that have moved their shitty websites made in, say, Ruby + Rails to performant languages (Golang, for example) have seen their server usage reduced to 1/8.

Also this. Why are we even using Java in phones? There are many safe languages already, and Java on top of Linux is... basically a big, slow wrapper.

Sure. Analogue electronics typically use even larger dimensions.

I think we won't have to go down to those sizes that are hard to engineer due to quantum mechanics in the foreseeable future. Instead, switching to a different strategy, like semiconductors other than silicon, or eventually switching to optical instead of electrical processing units, is a better option. That way, even if processors keep the same number of transistors, they'll still be way faster and more efficient because we can increase the clock speed compared to silicon CMOS.

As far as I know, QC is only useful for certain very specialized problems. Here's an analogy: QCs are to classical computers as injection molding machines are to lathes or mills. They can be much faster, but are far less flexible.