It's over. Computers are finished. No more computational progression. Only 5 years left of progress.
>b-but quantum computers
No, they're useless for anything other than encryption and quantum physics simulation.
The bottleneck is now on the shoulders of programmers.
The future of computational improvement lies in massive parallelisation. It's the fault of lazy programmers who are still thinking in a single-threaded way and running everything on top of a fucking JIT VM.
Christian Flores
very healthy
Juan Allen
>ditch silicon
>processors can now go above 100GHz
Kevin Howard
>transistor size determines performance of processors
>3D processors will never be made
Kayden Peterson
>No more computational progression.
Why don't you kill yourself?
Christian Taylor
It's scary that there isn't a real alternative.
No such thing
Only increases in physical volume, and that's not advancement.
Show ONE thing that can continue this trend.
And boys, even if we find a new material it won't last long, perhaps a decade before it reaches the atomic level again, and then we'll be completely finished.
Have fun having the same computers forever.
Carson Anderson
Huh, guess it's time to go back to banging rocks together then.
Carson Williams
designers thrive under limitations.
the faster we get to 5nm, the better shit we get.
Jonathan Ross
>2021 arrives
>get a ThinkPad workstation
>never have to buy a PC again
Noah Mitchell
>Matrix-level simulation will never be achieved
I'll get me some rope.
Oliver Rodriguez
Almost nailed it, but there is still some stuff we haven't tried, mainly novel processor and memory typologies, like processor fabrics where each core can access adjacent cores' memory but not any sort of global memory. But yeah, Moore's Law had a good run, and it's pretty much over.
Too expensive to use other types of semiconductors.
Although this could help with the issue that we can't move information across a modern chip in a single clock cycle because of the speed of light, we have a hard enough time cooling mostly 2D architectures. 3D will only get us a few layers. Besides, do you have any idea how many layers are on even 2D processors? Like 30.
Butthurt? Complain to the fucking UNIVERSE about PHYSICS.
It's like discovering photons and believing we can travel at the speed of light.
Charles Cox
Plenty of room for improvement at least.
We can all go back to writing C instead of fucking java.
Silver lining?
Hunter Walker
just read the article. The most retarded article I've ever read. You can even see it in the comments.
Gavin Foster
Quantum Computers.
The use cases you're describing are just the tip of the iceberg. If it can run calculations, it can do anything.
Parker Wright
never read tech blog comments unless you need to lose your mind with a quickness.
Jack Barnes
>typologies
>making a typo that is literally 'typo'
pure pottery
Hudson Lee
Ray Kurzweil has said that silicon is the 5th paradigm and there will be a few more paradigms until the singularity. I think we have at least another 10 years of solid progress for silicon before we hit a definite wall. We don't have to keep shrinking transistors to have progress. There are other techniques that could be utilized once we hit 5 nm.
Most of the promising long-term replacements for silicon unfortunately use expensive or out-of-reach materials like graphene or carbon nanotubes.
But once we are able to create graphene and CNTs in sufficient yields and quality, we will have computers thousands of times better than silicon chips.
Cameron Cruz
>Ray Kurzweil has said
opinion discarded
David Lee
Hur durr edgy
Lucas Moore
>AI meme won't happen
It's been nice daydreaming about it, lads.
Parker Miller
All x86 CPUs newer than Core2 and Bulldozer are botnet anyway
Austin Nguyen
k
Jacob Lopez
Does this mean my T420 will last me the rest of my life? I'm okay with this
Michael Stewart
>It's the fault of lazy programmers who are still thinking in a single-threaded way
As long as later operations need results from previous operations, that's always how it's going to be. It's not necessarily the programmer's fault.
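Amdahl's law puts a number on that: if only a fraction p of the work can run in parallel, n cores give at most 1/((1-p)+p/n) speedup. A minimal C sketch, with p = 0.90 as an assumed illustrative value:
```c
/* Amdahl's law: speedup on n cores = 1 / ((1 - p) + p / n),
 * where p is the fraction of work that can run in parallel.
 * p = 0.90 here is an assumed, illustrative number. */
#include <stdio.h>

int main(void) {
    double p = 0.90;
    for (int n = 1; n <= 1024; n *= 4) {
        double speedup = 1.0 / ((1.0 - p) + p / n);
        printf("%5d cores -> %6.2fx speedup\n", n, speedup);
    }
    /* Even with infinite cores the limit is 1 / (1 - p) = 10x. */
    return 0;
}
```
With 90% parallel code the ceiling is 10x no matter how many cores you throw at it.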
Carter Moore
Look, fucktards. You know that thing that holds your ears apart? It's a fucking
>computer
And it stomps your Stinkpad in the ground. Why?
>Parallelization
Hundreds of billions of neurons. We've barely even scratched the surface of what is computationally achievable.
in theory yes, but we've reached physical limitations, sry.
Brandon Parker
>10 nm – 2016
shiggy
>7 nm – ~2018
diggy
Joseph Thomas
>If it can run calculations, it can do anything.
the height of stupidity. quantum computers require quantum algorithms, which are not always going to be faster than classical algorithms. the most computationally intensive tasks most computers have to perform are video playback and sometimes rendering graphics for video games.
we might see quantum co-processors at best, if there are any worthwhile quantum algorithms in those two areas
Blake Anderson
Video playback is literally nothing compared to rendering vidya graphics.
Easton Smith
I didn't say it was, just noting that those two areas are the most computationally expensive ones. That's why you see multimedia instructions being added to CPUs time and time again.
Jackson Sanchez
Honestly it just seems like they make everything more bloated as technology gets faster. I'm talking about programs and web pages.
Imagine if everything still required only as much memory/CPU as windblows XP or dial-up-era web pages. We could do things crazy fast.
But no, let's make an OS that requires 4 gigs of RAM and 500MB webpages.
Christopher Peterson
Umm. Trinary? Also, why the fuck don't we have it yet.
Quantum computers/chips will be amazing for AI, 'nuff said.
Thomas Myers
Oh yes, I need to run a highly sophisticated fucking AI on my laptop that I use for watching YouTube videos and pornography.
Austin Harris
honest question here
why dont they make the CPU just bigger?
Gabriel Nelson
bigger = hotter. eventually though, once we hit the transistor size limit, we will be making bigger processors
Brayden Hughes
If we can pack down what we could and more for 22nm in 14nm let's say, why don't we pack even more than what we could fit in 14nm in 22nm if that makes sense?
Julian Jones
You don't need to, but at some point in the not-distant-enough future it's going to get shoved down your throat anyway.
Levi Allen
Indeed. you could have it pre-watch things and give them a rating based on your preferences, vastly improving your masturbation sessions by filtering out clickbait.
William Myers
>you could have it pre-watch things and give them a rating based on your preferences
YouTube already does this.
Isaac Rogers
...
Dylan Martinez
Diamond and graphene processors are already well into development, so no, not only is this not the end of computational progress, dare I say it's the beginning of monumental leaps and bounds.
in the meantime there's actually still quite a lot of architectural tweaks and designs left in silicon, so do not worry, you'll have more fanboy bullshit to spew in the meantime, i assure you.
But yeah, the easy path of 'just shrink the die again' is coming to a close. And it's about damned time.
Luis Rodriguez
I don't understand what you're trying to say.
Logan Roberts
>And it's about damned time
Yes, I am also tired of CPUs just werking, I want them to fuck the architecture up.
Aiden Gray
IF WE CAN PACK DOWN LET'S SAY A BILLION TRANSISTORS IN 22NM AND 1.5 BILLION IN 14NM THEN WHY DON'T WE PACK 2 BILLION IN 22NM
Leo Long
Good. It's not quite breaking free from the shackles of technology, but it's better than blindly Moore's Law-ing toward all-powerful AI gods controlling every aspect of human society, assuming they don't just wipe us out like bugs.
Carson Phillips
More like Graphmeme
It's the hottest shit in STEM and no one can figure out how to use it.
Owen Williams
okay so you are fucking retarded, thank you for clearing this up
quick question: what do you think those nanometer measurements represent?
Liam Butler
If you make them larger, you hit issues with the speed of light not being able to cross the whole CPU in one clock cycle. Information is not transferred instantaneously.
Noah Ross
calling bullshit on this one
Parker Miller
Quantum non-deterministic computers aren't any more capable than deterministic computers, they just solve certain problems faster. At most it will be akin to a GPU. Never will you run general purpose computing on a quantum machine.
Distance between identical components on the fabricated chip
Parker Thomas
>long-run investment in R&D for one specific technology has declining marginal rate of return
who'da thunkit?
Jordan Rivera
Because a bigger processor is more expensive to manufacture and runs at lower clock speeds, which matters to those who play severely CPU-crippled vydias that can only run on 1 CPU thread efficiently.
We actually already have a 24 physical core Xeon processor that can run at 2.2 GHz under maximum load. It has about 3X the performance of a desktop i7 with a TDP of only 165W. Did I mention it costs $7,000+?
However, like I mentioned, these fuckers are goddamn expensive. So much so that those building servers or supercomputers opt for cheaper 8-core Xeons to put in parallel.
Nathaniel Gray
>they're useless for anything other than encryption and quantum physics
>hurr durr i can read stuff of which i barely understand anything and then start shitposting about it, leaving the impression i know what the fuck i'm saying
>t. Dr. summerfag, future mcdonald's employee of the month
Cameron Davis
>Biological_computing
yea but it doesn't matter if it arrives at its destination according to our frame of reference, as long as its time happens relatively the same in its own space-time, and that's not even accounting for mass or gravity
Kevin Richardson
in case you haven't googled it yet, nm measurements in the context of processors refer to the size of the transistors
Owen White
Although I disagree, I don't feel the need to argue and only came into this thread to examine the breasts of that woman and blog about it.
Thank you and no thank you at the same time.
Ethan Collins
Light travels at 3.0e8[m/s] IN VACUUM
Let's say you are running a 4GHz machine.
1/(4e9 Hz) = 0.25e-9 s
0.25e-9[s] * 3.0e8[m/s] = 0.075[m]
That means at most your in-vacuum CPU can move information 7.5 cm per tick. Now, CPUs are built in silicon, in which signals propagate slower. Also, traces don't run as the crow flies. These are back-of-the-envelope calculations, but they show that die sizes and clock speeds are truly limited by how fast light travels.
It's actually worse than user said. Presently modern CPUs take more than a single clock tick to move data long distances. Optical buses may help alleviate that a tiny bit, but there would have to be substantial room for optical couplers and decouplers. Moreover, optical buses are only useful for long distance buses, not short ones.
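For the curious, the same arithmetic as a tiny C program, using the vacuum speed of light (real on-die signals are slower than this):
```c
/* How far a signal can travel per clock tick at the vacuum speed of
 * light. On-die signals in silicon are notably slower. */
#include <stdio.h>

int main(void) {
    const double c = 3.0e8;                          /* m/s, in vacuum */
    const double freqs_hz[] = { 1e9, 4e9, 10e9, 100e9 };
    for (int i = 0; i < 4; i++) {
        double tick_s = 1.0 / freqs_hz[i];           /* seconds per cycle */
        printf("%6.0f GHz: %.3f cm per tick\n",
               freqs_hz[i] / 1e9, c * tick_s * 100.0);
    }
    return 0;
}
```
At 4 GHz that prints the 7.5 cm figure from above; at 100 GHz you'd be down to 0.3 cm, less than the width of a modern die.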
Logan Johnson
blow me, edgelord... it is about damned time. the best work is always done under a difficult dilemma.
this will do nothing but spur advancements, as such things usually tend to do. This is not the first time jackasses were wailing about how the CPU cannot progress any further no matter what. This shit has happened like half a dozen times already, and every time a way to overcome it has been found and implemented.
Now we have the opportunity to explore new things. things are going to get really fucking exciting in the next 5-10 years. You won't be disappointed.
Except for your failure to troll...that will always disappoint.
William Rogers
>parallelization in C
god damn i might just wanna kill myself from reading that phrase alone. handling mutexes/threads in C is a total fucking nightmare
Jeremiah Reed
Is there any reason we can't just make processors bigger?
Ian Wright
Yes, if you could read the fucking thread.
William Barnes
lol wut. it's the only sane language to do that stuff in.
Andrew Murphy
A brain is much different from a conventional computer. To give you an idea, a conventional computer requires memory (in which instructions and data are stored) and a processor (a set of transistors organized as logic gates, designed to be controlled by a certain instruction set) which does the computation. In a brain, the neurons handle both the memory and the computation simultaneously. Imagine having no HDD and your processor doing all the work without separate memory or instructions, and you have a brain.
Robert Reyes
care to expand on it a bit? i've only done mutexes/threads in java, though i can't see why it'd be so different in C
Justin James
It's a fucking miracle with the huge ass instruction sets we have now that processors work as well as they do. I have no doubt that we will continue to make computational progress, but switching from transistor shrinking to architecture overhauls is going to be extremely painful.
Dylan Collins
>read thread
>you're just getting butthurt and calling people stupid over and over
Real fuckin educational. Go jump off a building.
Bentley Clark
see
Do you want to pay 7K+ for a fucking processor?
Landon Nguyen
>handling mutexes/threads in C is a total fucking nightmare
You literally lock and unlock a fucking thing, how dumb are you?
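For anyone following along, here's roughly what "lock and unlock a fucking thing" looks like with POSIX threads. A minimal sketch; build with cc -pthread file.c:
```c
/* Four threads increment a shared counter; the mutex serializes the
 * updates so the result is always exactly 4,000,000. */
#include <pthread.h>
#include <stdio.h>

static long counter = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *worker(void *arg) {
    (void)arg;
    for (int i = 0; i < 1000000; i++) {
        pthread_mutex_lock(&lock);    /* without this, updates race */
        counter++;
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void) {
    pthread_t t[4];
    for (int i = 0; i < 4; i++)
        pthread_create(&t[i], NULL, worker, NULL);
    for (int i = 0; i < 4; i++)
        pthread_join(t[i], NULL);
    printf("counter = %ld\n", counter);
    return 0;
}
```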
Jayden Nelson
literally explained 3 posts up from your stupid image macro, you fucking idiot. kill yourself
Nathan Cruz
Why don't these idiots just add more cores and make the CPUs bigger
ffs
Ryder Peterson
please don't bait with the others.
Nathaniel Adams
if by sane you mean it gives better performance than things like java, then yes
it's not too much different, but on top of all the other errors you can run into that a language like Java manages for you, it can be very difficult
any program getting reasonable parallelization will generally spin off more than 2 or 3 fucking threads, and managing them properly so they don't deadlock can be nontrivial in any program solving a real problem
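The classic deadlock is two threads grabbing the same two locks in opposite orders. One common fix is to impose a global lock order, e.g. by address. A sketch, not a full program:
```c
/* Avoid deadlock on lock pairs by always acquiring in a fixed global
 * order -- here, by mutex address. Sketch only. */
#include <pthread.h>
#include <stdint.h>

static void lock_pair(pthread_mutex_t *a, pthread_mutex_t *b) {
    if ((uintptr_t)a > (uintptr_t)b) {   /* order by address */
        pthread_mutex_t *tmp = a; a = b; b = tmp;
    }
    pthread_mutex_lock(a);               /* lower address first */
    pthread_mutex_lock(b);
}

static void unlock_pair(pthread_mutex_t *a, pthread_mutex_t *b) {
    /* unlock order doesn't affect correctness */
    pthread_mutex_unlock(a);
    pthread_mutex_unlock(b);
}
```
Since no thread can ever hold the higher-address lock while waiting on the lower one, no cycle of waiters can form.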
Owen Collins
>muh triple threads!
So how the fuck do other languages make it any better, besides adding a bunch of retarded constraints that you can add with C?
Austin Thomas
i usually use c# for this kind of stuff, so I guess my argument boils down to the fact that it's more built into the language than it is with something like C
Christian Clark
use an existing thread pool implementation like glib/ck/openmp has, you don't have to touch that shit just because it's C
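For example, with OpenMP the runtime owns the thread pool and the reduction, so you never touch a mutex yourself. A sketch; build with cc -fopenmp file.c:
```c
/* OpenMP spawns, schedules, and joins the worker threads for you.
 * reduction(+:sum) gives each thread a private partial sum and
 * combines them at the end, so no explicit locking is needed. */
#include <stdio.h>

int main(void) {
    double sum = 0.0;
    #pragma omp parallel for reduction(+:sum)
    for (int i = 1; i <= 100000000; i++)
        sum += 1.0 / i;
    printf("harmonic sum = %f\n", sum);
    return 0;
}
```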
Jeremiah Wright
You trolling or what?
NM means the size of the gate grid. Each gate grid can fit 4x4 transistors in it.
Common knowledge
Wyatt Phillips
>Only increases in physical volume, and that's not advancement.
AFAIU, advancement stopped a while ago already. the latest processors are not much better than those made in ~2008-2010... they just use more power to get more computational power, plus minor design improvements to get a bit more juice
Jason Flores
Is the singularity a meme?
William Hill
Why can't we make CPUs bigger?
Samuel Wright
>run AI on quantum computer
>AI replaces "lazy programmers" for the stuff that has to use traditional algorithms
>55904372
Jaxon Jenkins
>Too expensive to use other types of semiconductors.
what if everyone gets a dumb screen with wifi and access to huge mainframes?
Alexander Ortiz
>tfw God is watching over us
Luke Sanchez
shhhh
They're listening. That's probably what'll happen in the future.
Elijah Stewart
who is this semon demon?
Benjamin Campbell
no, it is not.
People saying it's going to happen within a hundred years are fucking kidding themselves though.
Sebastian Lee
This. Google returns a dead-end.
Brody Green
I want to impregnate those tits
Also, Non-Si transistors
Dylan Diaz
shut up
Brayden Robinson
>hundreds of billions
Thomas Williams
It does have some really impressive properties and I'm still hoping we find a way to use it and mass produce it efficiently and cheaply in the near future.