What are we going to do to continue advancing after 2022, when CPUs reach the atomic scale?
Is there any solution available?
Also, does anyone have any charts that display technological advancement in the 2010s? It seems like transistor count and GFLOPS-per-dollar charts all stop around 2011; is this because there has been no advancement in the last 5 years?
Intel will release their stashed photonic stuff and bankrupt ayyymd.
Mason Mitchell
Here's a chart displaying the 'Solar Miracle' that has been happening over the last 5 years.
Jacob Robinson
You can really tell this was made when Intel was pushing the P4 out like there was no tomorrow, trying to squeeze out moar gigahertz.
ILP: a FLAT LINE. Oh boy, were they wrong.
Zachary Wood
Here's china's cumulative solar capacity to show the exponential trend.
The last 5 years of solar growth are astounding and completely blew out even the most optimistic predictions.
Brandon Nguyen
One more: this is actual solar capacity vs the 2000, 2002, 2005 and 2007 predictions, hence why it's known as a 'miracle'.
Anthony Edwards
...
Alexander Hall
This is $ per GFLOP graphed on a linear and a logarithmic scale. Over 50 years it's gone from $1.1 trillion per GFLOP of processing power to about $0.04 in 2016.
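Back-of-the-envelope, that drop implies the cost roughly halved every single year. A quick sketch (the 50-year span is taken from the post; the exact start year is an assumption):

```python
# Implied annual decline in $/GFLOP, using the figures quoted above:
# $1.1 trillion down to $0.04 over 50 years.
start, end, years = 1.1e12, 0.04, 50

annual_factor = (end / start) ** (1 / years)
print(round(annual_factor, 3))      # cost multiplier per year (~0.54)
print(round(1 - annual_factor, 3))  # fractional drop per year (~46%)
```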
Jayden Perry
This is the TOP500 supercomputer list on a logarithmic scale, displaying the Sum of all 500, the #1 Supercomputer, and the #500 Supercomputer.
Also included are points for the upcoming 'Aurora' and 'Summit' supercomputers, paid for by the Obama administration and currently in progress.
Angel Myers
This one's only semi technology-related, but it displays the number of buildings over 200 meters tall built worldwide per year, and the cumulative total.
Between 2010-2016 we built as many skyscrapers as were produced between 1960-2010.
Hudson Myers
...
Brody Hernandez
Pic related, the Shanghai tower, completed in July this year, now the 2nd tallest building in the world.
Tyler Jackson
This chart from the Nvidia Computex Keynote displays the explosion of growth for mobile computing devices since 2010.
Luis Scott
This chart shows the falling cost per GB of storage, including a trend line showing what could have been, had it not been for the tsunami.
Joseph Ward
Some solutions
- alternative materials: GaAs (doable) and InP (very hard)
- alternative materials 2: switch to superconductors (quite hard, but was done nearly 30 years ago in Japan)
- alternative materials 2b: based on superconductors, use RSFQ logic (the theory is about 40 years old, has been tested, not trivial)
- stacking: through-silicon vias, stacking wafers to increase density
- stacking 2: place transistors above the wiring layers, then add new wiring (very hard)
- architecture change 1: drop x86, introduce flag days to drop misfeatures in x64 and Itanium (technically not hard, way overdue)
- architecture change 2: use dual-rail logic
- architecture change 3: use self-clocking logic
- architecture change 4: combine the two above (papers on this were published in the mid-90s)
- architecture change 5: combine the above (arch 4) with RSFQ logic (100 GHz is expected to be possible)
As you can see, in /sci/ence many ways forward are known, we just wonder when en/g/ineering will use these.
Adam Myers
Looks like it's falling.
Dominic Howard
14nm is just marketing. The width of the fins is 7nm and the distance between fins is something huge, like 60nm. There is still a lot of room for improvement.
Adam Williams
It's called optoelectronics. Photonic logic gates have lower energy per op than any transistor can, because exciting a photon requires orders of magnitude less energy than influencing the electrical resistance in a channel. It's the only approach that both continues area scaling and facilitates radically increasing density in the Z plane. Not many devices other than vertical GAAs have the same characteristics, and wires have a finite size.
Ayden Campbell
>physics laws
>subject to change
There already is a quantum computer commercially available from D-Wave Systems, and the NSA probably has one as well.
Christopher Jackson
>he fell for the dwave meme
Jason Davis
Life must be fun being so naive.
Zachary Garcia
This chart shows the peak GFLOPS performance of high-end GPUs; for reference, the newly released 1080 would be around 11,000 on this scale.
D-Wave shill, please, you've got better places to spend your work hours on; tell your bosses.
Quantum computing for now is nothing more than popsci garbage.
Aiden Cook
What does Sup Forums think of the 'Singularity'?
While I agree with the reasoning that technology has been advancing unbelievably fast, and that the difference between now and 20 years ago is akin to the difference that would separate hundreds of years previously, it seems a little wishful, doesn't it?
I think the more realistic idea is that we'll look back on this 40-year period of 1980-2020 and see it as a technology boom that can't be reproduced; the alternative is a little too ridiculous to be taken seriously.
Camden Stewart
what are you talking about? We can't make transistors smaller than 5nm because of quantum tunneling.
This user kinda listed all the viable ones. Thing is, for most of those you need really cold temperatures, hence the SC and RSFQ.
Robert Brown
Make CPUs physically bigger and keep a bigger process.
Concurrency is one way to solve it. You could also squeeze a small performance increase by improving the architecture and optimizing software layers.
MOAR CORES will probably be the trend instead of relying on competent programmers.
Joseph Cooper
Not a graph, but an example of real time rendering in video games improving between 1993 and 2016.
Landon Bailey
You can plot any two things together on a graph and show a nonexistent trend. The abstraction of "events" in this case is a nonsensical quantity; it's subjective. Kurzweil does this a lot to show exponential growth where none exists, because he needs to force everything to fit a narrative, since he's profiting from it.
Serial integer performance has been suffering the effects of diminishing returns. It's not an issue that pursuing higher clocks can ever solve either: we can't feed instructions to the execution logic substantially faster than we already do. It's the reason Intel will only muster a 3%-10% IPC uplift per generation while targeting improvements in perf/watt.
> We can't make transistors smaller than 5nm because of quantum tunneling.
Stop regurgitating terms you know literally nothing about. Stop it.
Nathan Morales
This graph shows yearly Research and Development spending by a selection of countries as a percentage of GDP
Jack Lopez
This is an interesting one, it displays GDP per capita from the year 1500 til 2000.
Landon Phillips
This is the UK life expectancy over time, from 1930 until 2013.
Hunter Anderson
Related, prediction of Centenarian rates to the year 2100.
A first-worlder born in 1995 has a 27% chance of living to 100, excluding possible medical advancements in the future.
Nathaniel Perez
This graph predicts the worldwide Exabytes produced per month.
An Exabyte is 1000 Petabytes, and a Petabyte is 1000 Terabytes.
In 1986, worldwide total storage capacity was 2.6 Exabytes.
Elijah Brown
This graph estimates total worldwide data, cumulative.
In 2013 it was 4 Zettabytes, in 2016 it's 13 Zettabytes, and by 2020 IDC predicts it will have expanded to approximately 45 Zettabytes.
1 Zettabyte is 1000 Exabytes.
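Those prefixes are easy to mix up, so here's a quick sketch of the decimal unit ladder and the figures quoted above:

```python
# Decimal storage units: each step up is a factor of 1000
TB = 1000**4     # bytes in a terabyte
PB = 1000 * TB   # petabyte
EB = 1000 * PB   # exabyte
ZB = 1000 * EB   # zettabyte

# The figures quoted in the thread, for scale:
print(2.6 * EB)       # 1986 worldwide storage capacity, in bytes
print(45 * ZB // EB)  # IDC's 2020 estimate, expressed in exabytes
```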
Landon Edwards
This graph shows worldwide data demands, compared with actual storage production.
As you can see the world is rapidly running out of storage, and will soon be unable to store all of the data it produces.
Anthony Brown
Well, it's sort of "emulating" a quantum computer for the concept, but it still uses ordinary transistors like any normal PC, unfortunately.
Jaxon Lee
Serious question: how do quantum computers help us continue to progress? For instance, let's say a 'quantum computer' is released in 2022 equivalent to the best 2nm architecture available at that time. Where would we go from there with the quantum computer that we couldn't go before?
Ayden Howard
>trend if no crisis
Those floods were bullshit.
David Phillips
The entire world storage market has been permanently crippled by those floods.
Henry Fisher
Isn't this just because we forget things at an exponential rate relative to how long ago they happened? I'll bet the move from vacuum tubes to silicon is on that chart but the move from mud huts to mud-and-grass huts isn't.
Jayden Edwards
This chart is similar in concept, but better labelled
Adam Rogers
This chart is more presently focused, showing lowering amounts of time for 'paradigm shift technologies' to become ubiquitous.
Andrew Cruz
QC can't run crysis. That's all you need to know.
Josiah Jones
Damn, didn't know Canonical did all these things. Really makes you think.
Noah Cook
No, what they did cripple is your head, by getting you to believe this wasn't a huge scam to raise prices.
Logan Rodriguez
The entire concept of a Singularity still seems ridiculous, however. Imagine it as it would actually have to be in the real world: if some new 'paradigm shift' were introduced literally every day, the population physically couldn't keep up. We can't all rush out to the stores to blow $1000 on the newest brain chip that's 10,000x better than yesterday's model every 24 hours.
Ian Hernandez
I personally know someone who has knowledge of their photonics stuff. Whenever they do start using it in their chips they will blow anything else out of the water.
Andrew Watson
In that case, will you tell me about 'photonics'? Same question as with quantum computing: how will they allow us to advance in 2022? Can we expect crazy gains?
Grayson Harris
The points are still arbitrary though; you could make the graph say anything if you picked the right events. Like how the fuck is HDTV a "paradigm shift"? It's just regular TV but slightly better. oh yeah? well my dad works at nintendo.
William Williams
I agree with you, but I must say that even removing some of the stranger points of the graph, it still makes a solid point that 'large technology changes' are happening more and more frequently as time goes on.
To use a common example, bring a man from 1990 into today, and he'll be amazed at our smart phones, super computers and the internet.
Bring a man from 1800 to 1945, and he'll be amazed at flight, cars, and steam power.
Bring a man from 1500 to 1700, and he might not be so surprised.
500 to 1500, 500 B.C to 500 A.D
100,000 B.C to 2000 B.C
The timescale of 'change' as a concept has shrunk from millennia to centuries to decades to years.
Ryder Thompson
Sounds pretty dank to me. Intel probably has that shit already and we just have to wait a million years for AMD to catch up.
Isaiah Mitchell
The tsunami is just part of the truth. Demand also went significantly down thanks to SSDs.
Carson Smith
They won't actually be able to go down to 1nm, before that they'll have problems with quantum tunneling. We'll see 10nm chips eventually, and probably even 7nm, but anything below that would be pretty much impossible to use, because electrons would just randomly tunnel through into other lanes and fuck everything up thanks to quantum mechanics.
After 7nm, they'll need to start thinking about new architectures, different materials, and more cores to get any improvements in CPU power. I predict that Moore's Law will completely collapse upon the release of 7nm CPUs; progress will essentially cease until loads of money have been sunk into R&D, and it will take years to actually bring any product to market. If AMD manages to stay in business until then, they'll probably end up catching up to Intel. There will probably be a 7nm equivalent to the 2500K which lasts a fucking decade or more due to stagnation.
Jaxson Garcia
GloFo is skipping 10nm and going straight to 7nm. So 7nm might be sooner than we think.
Jeremiah Ramirez
It'd be interesting to see how processing speed stagnation will affect software devs.
They'll finally be forced to optimise their shitty code again.
Also the mobile industry still has a lot of catching up to do, so it'll probably boom greatly.
Zachary Bennett
The right still looks shit.
Ian Watson
I'll never understand posts like this.
You think the high-poly 3D render on the right, with dynamic lighting, ambient occlusion, tessellation and anti-aliasing, compared with a two-dimensional image rendered in the 90's at about 800x600 pixels, "looks like shit"?
Parker Long
Specs don't mean shit in real life. It's like how the iPhone still BTFOs every Android device with its "inferior hardware".
Jose Gray
Well, most computing these days involves networking, and they found that the single slowest link in the process was the conversion of optical signals into electronic signals at the network edge. So they decided: why not just make a computer out of photonic logic gates, so the entire network + computer is based purely on optical signals? The only real advantage you'll see is that you can play Counter-Strike online at 0.001ms latency, but that's about it.
I'll try and summarize the three types of computing by explaining how a basic transistor would work in each:
>Electronic: The control bit can activate or de-activate the gate: its electrostatic field either depletes or replenishes the channel between source and drain with carrier electrons.
>Photonic: The source beam and the control beam interact in a non-linear optical medium. Depending on the presence or absence of the control beam, a signal beam may be generated at a higher harmonic and then down-converted so it can be fed into the next gate as either a control or a source.
>Quantum: If we have two qubits, then we need to find an interaction between them such that the state of qubit 1 will influence the state of qubit 2 as it evolves in time. So our gate needs to put qubit 2 into a state where it will evolve at a different rate depending on the state of qubit 1. Then we catch qubit 2 after it has evolved for a certain amount of time, and it will have a particular correlation with qubit 1. For example, in research so far, a typical gate has been achieved with either the exchange interaction or the dipole-dipole interaction, and manipulating the spin is done with radio-frequency pulses.
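To illustrate that "qubit 2 evolves conditionally on qubit 1" idea, here's a minimal numerical sketch of a CNOT, the textbook two-qubit gate (the matrix is standard; the rest is just NumPy, not any particular physical implementation):

```python
import numpy as np

# Single-qubit basis states |0> and |1>
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# CNOT flips qubit 2 only when qubit 1 is |1> -- the simplest case of
# "qubit 2 ends up in a state conditioned on the state of qubit 1".
# Basis ordering: |00>, |01>, |10>, |11>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def apply_cnot(q1, q2):
    """Apply CNOT to the joint state of control q1 and target q2."""
    return CNOT @ np.kron(q1, q2)

# Control |0>: target passes through unchanged
print(apply_cnot(ket0, ket1))

# Control in superposition: the output is entangled (a Bell state),
# which is exactly the "particular correlation" the post describes.
plus = (ket0 + ket1) / np.sqrt(2)
bell = apply_cnot(plus, ket0)
print(bell)  # (|00> + |11>)/sqrt(2)
```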
Evan Sanders
quantum computing seems very interesting then, thanks for taking the time to type that up for us user
James Morales
...
Landon Baker
The R&D going into alternatives to silicone is growing; by the time 7nm is commercially available, I think we will have a silicone alternative. I doubt that Moore's Law will collapse: giants like Intel are already pouring money into R&D for alternatives.
Daniel Allen
That curve for India is suspiciously smooth. The double plateau for Japan is also strange.
Aiden Sanders
Don't reply to bait
Asher Perez
>can play counterstrike online at 0.001ms latency
Nah, the speed of light isn't going to get any faster.
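A quick sanity check on that number (the 100 km link is a hypothetical distance, just for scale):

```python
C = 299_792_458  # speed of light in vacuum, m/s

def min_latency_ms(distance_m):
    """One-way light-speed travel time: the hard lower bound on latency."""
    return distance_m / C * 1000

# In 0.001 ms (one microsecond), light covers only about 300 m,
# so sub-microsecond latency to a remote server is physically impossible.
print(round(C * 1e-6))               # metres covered in 1 microsecond
print(round(min_latency_ms(100_000), 3))  # best case for a 100 km link, ms
```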
Owen Robinson
It's not 800×600, it's 320×200.
Camden Edwards
>the population physically wouldn't keep up To some extent we have already reached the point where large parts of the population cannot keep up.
The segment that is the most noticeable is the segment with politicians and the chattering classes. You know, those who want to control the people and the future.
Jason Torres
Again: stop regurgitating terms you know literally nothing about. There is no issue with electron tunneling in an insulated device; 5nm and 3nm GAAs are not an issue.
You're a popsci meme spewing redditor.
>typing silicone >twice
Blake Howard
GloFo, Samsung and IBM collaboratively made a working 7nm chip 5 years ago, which back then they estimated would be commercially viable in 2018. It uses a germanium-silica alloy.
David Scott
>Intel probably has that
Unlikely. Recent history (like the x64) shows they have fossilised in their niche: ancient architecture propped up with extreme semiconductor technology.
I never saw a single publication from Intel on superconductivity. Hypres already makes chips, Intel is not moving out of semiconductors.
Levi Lewis
It's called SiGe, silicon-germanium, and it's widely used in everything. Every 28nm mobile chip on the market has SiGe insulation in it.
IBM's 7nm process uses a SiGe channel which is one of the only notable uses of it. They announced production of functioning chips on the process only last year.
>Sup Forums really is this clueless
Jeremiah Wright
People are unwilling to buy a new PC every 3 years so I'm sure that won't happen.
The problem is that starting in 2006 there was a gradual decrease in the usefulness of going smaller; the gains just aren't there anymore, and it costs a shitload...
>make an ass of yourself >try to behave like a smartass when called out >link an ars article in true pleb fashion
Pottery in motion
Juan Carter
This. One example: companies that have moved their shitty websites from, say, Ruby+Rails to performant languages (Golang, for example) have seen their server usage reduced to 1/8.
Also this. Why are we even using Java in phones? There are many safe languages already, and Java on top of Linux is... basically a big, slow wrapper.
John Diaz
Sure. Analogue electronics typically use even larger dimensions.
Christian Bell
I don't think we'll have to go down to those sizes that are hard to engineer due to quantum mechanics in the foreseeable future. I think switching to a different strategy, like semiconductors other than silicon, or eventually moving to optical instead of electrical processing units, is a better option. That way, even if processors keep the same number of transistors, they'll still be way faster and more efficient, because we can increase the clock speed compared to silicon CMOS.
Robert Gomez
As far as I know, QC is only useful for certain very specialized problems. Here's an analogy: QCs are to CCs as injection molding machines are to lathes or mills. They can be much faster, but are far less flexible.