Why has progress slowed so much?

Decent PC in 1996 vs 2006

CPU: Pentium 133 MHz vs Core 2 Duo 2.4 GHz (almost 20x the clock speed, and an even bigger real performance increase)
RAM size: 16 MB vs 1 GB (64x increase)
HDD size: 1 GB vs 250 GB (250x increase)

Decent PC in 2006 vs 2016

CPU: Core 2 Duo 2.4 GHz vs Core i5-6600 3.3 GHz (a much smaller performance increase than in the previous 10 years)
RAM size: 1 GB vs 16 GB (only 16x increase)
HDD size: 250 GB vs 4 TB (only 16x increase; quick math on all these multiples below)
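
To put rough numbers on those multiples, here's a minimal back-of-the-envelope sketch in Python, using only the ballpark specs quoted above (clock speed is a crude proxy for CPU performance, so treat the CPU row loosely):

```python
# Back-of-the-envelope multiples for the two decades above.
# All figures are the ballpark specs from this post, not benchmarks.

def annual_growth(multiple, years=10):
    """Compound annual growth rate implied by a total multiple."""
    return multiple ** (1 / years) - 1

decades = {
    "1996-2006": {"CPU clock": 2400 / 133, "RAM": 1024 / 16, "HDD": 250 / 1},
    "2006-2016": {"CPU clock": 3300 / 2400, "RAM": 16 / 1, "HDD": 4000 / 250},
}

for decade, specs in decades.items():
    for name, multiple in specs.items():
        print(f"{decade} {name:9}: {multiple:6.1f}x total, "
              f"{annual_growth(multiple):5.0%}/year")
```

Capacity growth fell from roughly 50-75% per year to about 30%, and clock speed growth collapsed from ~34% per year to ~3%.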

Because of complacent, greedy manufacturers, basically.

You don't even have to look that far back. I remember that in 1996 we had a Pentium Pro 200 MHz and in 2000 we bought a Pentium III, and the performance difference was night and day. But I'm using a 2500K and it's still fast as fuck even today. And it's not just high-end desktop CPUs: I have an X220 ThinkPad with an i5 too, and it can still do everything bar running heavy professional software.

two basic factors:

we're really close to the limit of silicon right now; it's becoming harder and harder to cram more transistors onto a single die. moore's law is no moore. after the hard leap to 10nm we will be forced to abandon silicon in order to go lower (rough runway math after this post).

lack of competition from AMD let Intel coast and milk us. no competition = no innovation. competition drives innovation.
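
On the first factor, here's a minimal sketch of how finite the shrink runway is. The starting point and floor are my assumptions, not the poster's: the Intel 4004's 10 um process (1971) as the start, and ~0.2 nm (roughly atomic scale) as a hard physical floor. If transistor density doubles every ~2 years, linear feature size must shrink by sqrt(2) each period:

```python
# Why "moore's law is no moore": density doubling every ~2 years means
# feature length shrinks by sqrt(2) per period, and atoms set a floor.
import math

start_nm = 10_000                  # assumed start: 4004's 10 um node, 1971
floor_nm = 0.2                     # assumed floor: ballpark atomic scale
shrink_per_period = math.sqrt(2)   # density 2x  =>  length sqrt(2)x
period_years = 2

periods = math.log(start_nm / floor_nm, shrink_per_period)
print(f"~{periods:.0f} shrink periods, runway ends "
      f"~{1971 + periods * period_years:.0f}")
# -> roughly 31 periods, running out somewhere around the early 2030s
```

Under those assumptions the whole game lasts about 60 years, which is exactly why the endgame feels like now.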

IDK how many years the Intel Sandy Bridge family has left (Sandy, Ivy, Haswell, Broadwell, Sky, Kaby, Coffee)

There was a presentation I can't find anymore by a Polish programmer who drew parallels between his boyhood fascination with airplanes and the more contemporary fascination with technology and Moore's law. He remembered the Concorde, and back then imagined what future airplanes would be like. It turns out that the airline industry is one of the most expensive ones with the least profit, and that the Concorde wasn't "efficient" enough. In other words, it doesn't matter whether Paris-New York takes 8 hours or 4; what matters is keeping flights as cheap as possible. That's why the Concorde was retired and why airplanes haven't changed much in the last decades (besides some autopilot and radar stuff).
The same applies to technology. First, the race to moar GHz halted and was replaced with moar cores. Now the moar-cores race has almost halted too, and the research is going into alternatives to x86_64 and less resource-hungry designs, even if that means fewer CPU instructions. We have reached a "good enough" level twice, and we're accommodating a different market.
Progress hasn't slowed at all, progress itself... can progress.

We are hitting limits on how many transistors we can get on a chip.

Multi-core CPUs delayed this for about 15 years, but even they only go so far.
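
One standard way to see why cores only go so far is Amdahl's law (my framing, not the poster's): if only a fraction p of a workload parallelizes, n cores can never speed it up past 1/(1-p). A minimal sketch:

```python
# Amdahl's law: even a highly parallel workload saturates fast.
# speedup(n) = 1 / ((1 - p) + p / n), where p = parallel fraction.

def speedup(p, n):
    return 1 / ((1 - p) + p / n)

for cores in (2, 4, 8, 16, 64, 1_000_000):
    print(f"{cores:>9} cores: {speedup(0.95, cores):5.1f}x")
# With 95% parallel code the ceiling is 1/0.05 = 20x, no matter
# how many cores you throw at it.
```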

It's just like lithium batteries, some new process/method will need to revolutionize the market to push things forward again.

Found in another thread:
>Supply side has been steadily increasing its willingness to fuck over its customers. Demand side has been steadily decreasing in intelligence.

I remember that essay - some white-cube hipster blog, the kind with pics and shit.

yes, ARM and friends plus custom TPUs can be the future - basically an ultra linear-algebra brain plus some 1-watt networking and a TCP connector

The single-ALU paradigm is starting to hit its limit; only massively parallel ALU designs will keep growing - things like GPUs.

Another point: 2D circuits are hitting the limit on shrinking components. The next revolution needs 3D-layered circuits.

nice word salad

That shit applies to everything, you dumbass.
>1-12 year old kid: grows normally
>13-20 year old: goes through puberty and grows even more
>21+: stops growing altogether
>50+: starts getting weaker
>65+: starts shrinking due to joints wearing away

From 2000 to 2009 the future looked bright, but from 2010 to now it feels like decline and stagnation in everything, and the trend only keeps getting worse. You can't do anything about it.

>t. What's diminishing returns??????

Diversity?

that's only a scam scheme; the rabbit hole goes deeper

The slowdown is great for consumers. Computers are useful for longer than ever. Manufacturers are just piling on useless gimmicks to justify new models.

Gee, it's almost as if exponential growth doesn't go on forever. Who'd've thought.

decent computer in 1986 vs 1996
cpu: 16 MHz vs 133 MHz (only 8x frequency increase)
ram size: 2 MB vs 16 MB (8x increase)
conclusion: early 2000s were just special

I wonder how they priced a human brain in order to put it on a dollar scale.

no wars

The graph was made by Kurzweil, who is a fucking fraud (as a futurist, not as an engineer).

That's how technology advances.
2001: A Space Odyssey seemed like a realistic future in 1968 because we'd gone from flying a few meters to the Apollo missions in about 60 years. People make the same mistake with computer technology and think the exponential increase of the last 60 years will continue forever.

So... are mice doing 10^10 calculations per second per $1000? Because I can catch a lot of mice if you need more computational power for way less than $1000.

Show me an accurate equivalence between binary computations and organic brains.

I don't think it's as simple as 1 transistor = 1 neuron.

human incompetence in general and material costs.
that's really the only real answer.

>explodes dead dinosaurs because too lazy to walk
>pokes sand with electrons because too lazy to do calculations.

I genuinely don't know whether these are signs of competence or incompetence. Which I guess puts me in the latter category, but let's be honest, it's just laziness: "eh, it works well enough, we'll figure it out later, if we can be arsed."

>2mb in 1986
more like 64k

But can I use them to mine for bitcoins?

it's not even that, really. it's more that complex systems always devolve into some mess

>Cramming billions of transistors into a single square inch
>Human incompetence

*tips fedora*

limits inherent to 2D silicon-based design

additionally, even with literally infinite computational power, a single careless programmer can ruin everything

Sales and marketing, essentially.
In the early 2000s everyone started to need a computer, so the technology had to reach a point where it was usable for everyday tasks and the occasional complex one.
After that, the focus became less "let's make it feature-rich" and more "let's make this run on grandma's computer that hasn't been updated in 13 years," which eliminated the need for heavier-hitting hardware.

Don't worry OP, once quantum computing and carbon nanotubes/graphene become commercially viable, we will enter a new era of rapid advancement in computing.

CPUs have slowed to a crawl.
In 1990 you had a 20 MHz 386.
In 2000 you had a 1 GHz Athlon, for 50x the clock speed and about 10x the computrons per clock cycle.
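
Multiplying those two figures out (a sketch using only the numbers in this post):

```python
# Rough 1990 -> 2000 CPU speedup: clock multiple times per-clock multiple,
# using only the ballpark figures from the post above.
clock_multiple = 1000 / 20    # 20 MHz 386 -> 1 GHz Athlon
per_clock_multiple = 10       # "about 10x the computrons per clock cycle"
print(f"~{clock_multiple * per_clock_multiple:.0f}x overall")   # ~500x
```

Compare that ~500x in a single decade against the roughly 1.4x clock bump of 2006-2016 upthread, and the flattening is obvious.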