When will Intel/AMD break the 10 GHz base clock barrier on consumer-grade CPUs?
I'm predicting 2022
Probably a few years after we move from silicon.
woah you gave them a whopping 5 years to triple that shit
Not in our lifetime. No reason to.
>No reason to
>Intel cucks insult AMD for trying
Same time we get Planck-scale transistors
Never. It's the physical limit of silicon-based computer engineering.
A couple of years for liquid nitrogen overclocks to get there.
Can't we just make the chip bigger, like the Xeon chips?
They did clock 7GHz last week. Maybe 2020.
We've been stuck at ~4ghz stock consumer speeds for what, 10 years now?
Not changing any time soon. Enjoy your 128 cores that no single program uses anyways.
I predict that Netburst will achieve 10GHz by 2005.
>128 cores
Ha! We've been stuck at 4 cores for what, 8 years now?
Enjoy your 4GHz quad cores with marginal efficiency improvement.
You were right. The 2005 Pentium series was very much able to overclock to 10 GHz on liquid helium.
Too bad it was shit.
Clock speed is limited by signal propagation time across the chip and by how well the chip can shed heat. A bigger chip, while allowing more heat dissipation, forces signals to travel longer distances within each cycle, so there's no clock speed increase.
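Rough back-of-the-envelope on the propagation part (just a sketch; the ~0.5c on-die signal speed is an assumed ballpark, not a measured number):

```python
# Back-of-the-envelope: how far a signal can travel in one clock cycle.
C_VACUUM = 3.0e8       # speed of light in vacuum, m/s
ON_DIE_FRACTION = 0.5  # assumed: on-die signals move at roughly half of c

def reach_per_cycle_mm(freq_ghz):
    """Distance (mm) a signal can cover in a single clock cycle."""
    cycle_time_s = 1.0 / (freq_ghz * 1e9)
    return C_VACUUM * ON_DIE_FRACTION * cycle_time_s * 1e3

for f in (4, 10):
    print(f"{f} GHz: ~{reach_per_cycle_mm(f):.0f} mm per cycle")
# 4 GHz: ~38 mm, 10 GHz: ~15 mm -- the same order as die dimensions,
# which is why nothing tries to cross the whole chip in a single tick.
```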
That still won't help. The only thing that will help is computer engineers finding a way to dissipate the kind of energy running through a 10+ GHz chip. Increasing the surface area of a silicon chip doesn't let them move that heat as fast or as effectively as required.
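To put rough numbers on that with the standard dynamic power model P = C*V^2*f (a sketch only; the capacitance and voltage figures are made-up ballparks, not any real chip's specs):

```python
# Dynamic CMOS power: P = C * V^2 * f. Higher clocks also demand higher voltage,
# so the heat grows much faster than the frequency does.
SWITCHED_CAP_F = 2e-8   # assumed effective switched capacitance (illustrative only)
BASE_FREQ_HZ = 4e9      # assumed baseline clock: 4 GHz
BASE_VOLTS = 1.2        # assumed core voltage at the baseline clock

def dynamic_power_w(freq_hz, volts):
    return SWITCHED_CAP_F * volts ** 2 * freq_hz

p_base = dynamic_power_w(BASE_FREQ_HZ, BASE_VOLTS)
# Crude assumption: voltage has to scale roughly with frequency to stay stable.
p_10ghz = dynamic_power_w(10e9, BASE_VOLTS * 10 / 4)
print(f"~{p_base:.0f} W at 4 GHz vs ~{p_10ghz:.0f} W at 10 GHz on this crude model")
# That's about (10/4)^3 ~= 15x the heat for 2.5x the clock.
```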
no ghz
no cores
128gb onboard l1 cache only
Enjoy your $7000 CPU.
Never with silicon. Heat increase is exponential.
We'll need a better material for that. Graphene is expected to hit somewhere around 500 - 1000 GHz once mastered, but right now, it's nothing more than a meme.
Why not completely mix things up and just replace CPUs with FPGAs? Instead of running software, your applications will be implemented in hardware via FPGA for maximum performance.
>Planck-scale transistors
>Planck time cycles
>~1.9x10^43 hz (one cycle per Planck time)
Is there even a word for that?
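For what it's worth, the arithmetic, assuming one cycle per Planck time:

```python
# One cycle per Planck time: f = 1 / t_P
PLANCK_TIME_S = 5.39e-44                 # Planck time, seconds
print(f"{1.0 / PLANCK_TIME_S:.2e} Hz")   # ~1.86e+43 Hz
```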
Limited by the speed of light
Electrons can't travel through a large die in one "tick"
We could go back to taking the cache and shit off the die and putting it on the PCB to leave more room on the die, but then it'd be slow for other reasons.
Whenever Ryzen gets released and liquid nitrogen OCers get their hands on one.
Did you know the highest-clocked CPU ever was a Bulldozer derivative, and the record still hasn't been matched? valid.x86.fr
Good thing electrical current isn't reliant on the speed of electrons. Electrons moving through a wire are pretty damn slow, if you didn't know. It's the electric field that's propagating; the electron drift is kind of irrelevant.
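Quick sanity check on that (sketch only; the 1 A current and 1 mm^2 cross-section are assumed round numbers, copper's free-electron density is the textbook figure):

```python
# Electron drift velocity in a copper wire: v = I / (n * A * q)
ELECTRON_CHARGE_C = 1.602e-19   # elementary charge, coulombs
N_COPPER = 8.5e28               # free electrons per m^3 in copper (textbook value)
CURRENT_A = 1.0                 # assumed: 1 A of current
AREA_M2 = 1e-6                  # assumed: 1 mm^2 cross-section

drift_m_s = CURRENT_A / (N_COPPER * AREA_M2 * ELECTRON_CHARGE_C)
print(f"drift velocity ~{drift_m_s * 1e6:.0f} um/s")   # roughly 70 um/s
# The field, by contrast, propagates at a large fraction of c -- that's what
# decides how far a clock edge gets in one cycle, not the electrons themselves.
```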
Fiction
Nah, fiction would be creating a pocket dimension where time moves faster, doing the calculations in there, then importing them back into this universe.
Pretty sure Intel already patented that
What if we're the pocket dimension?
Too much Zen for one thread.
What if our entire existence is just a random fluctuation in space and time, and everything could simply cease to exist at any instant through some cosmic dice toss?