Where did it all go wrong?

look at people's predictions from 2000 for what processor technology would be like by 2011
geek.com/chips/intel-predicts-10ghz-chips-by-2011-564808/

>if 10 ghz is the best that intel can do by 2011, amd or somebody else is going to eat their lunch. intel better pick up the pace

>we won't even be running silicon chips at all by then. maybe we'll all be running dna computers, where mhz and ghz and thz are irrelevant. just think: what kind of computer were you running in 1989? i never dreamed of computers like the one i have (which is low-end, by today's standards) back in 1989.

> by 2011, it won't be necessary to have expansion cards (i.e. graphics cards, sound cards, or whatever) at all. the future computer i envision ... would be a box, with ports, that could be configured to do any number of functions ... added just-in-time…

>by 2011 we will have implementations of quantum processing that will make the xhz debate look like the colonists debating secession from the uk.

this is sad, where did we go wrong?

She just posted that pic

Do you stalk her?

she's my wife, idiot
now please talk about cpus

who is she (male)

intel shanking AMD 2 years into their 10-year contract

Lack of investment. Same goes for the federal government. Now is the time to issue bonds and build heavy national infrastructure.

I want to lick her feet

She looks cute!

tell me who this is right now.

you idiots are fooled by makeup

this is an 8/10 dressed up

No competition happened.
Though 10 GHz is pushing it.

But why invest in anything when you're number 1 and no one else can move up? (And everyone else is going down.)

that's just what she looks like, r*tard

Where a calculator like the ENIAC is equipped with 18,000 vacuum tubes and weighs 30 tons, computers in the future may have only 1,000 vacuum tubes and perhaps weigh only 1.5 tons.
- Popular Mechanics, March 1949.

>r*tard
you have to go back

...

back in grad school i worked on computing with light and transistors that had ten states (0-9, or base 10) rather than 2 (0 and 1, or binary). anybody who thinks that these types of technology won't be commercially available by 2011 is kidding themselves. in addition, new os capability to scale up and out will radically change how we compute. maybe clock speeds will only be 10 ghz by then, but dozens and dozens of processors may coexist on a single chip and process data in base 10 (or hex) instead of base 2, effectively performing hugely more complex computations with fewer transistors and (relatively) lower clock speeds than would currently be needed. i have seen the future and it rocks!!…
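the base-10 part is at least checkable arithmetic. the only inherent win is representational density: one ten-state device carries log2(10) ≈ 3.32 bits, so you need about 3.3x fewer digits for the same range, not "hugely more complex computations". a minimal sketch of that claim (digits_needed is a made-up helper, and this assumes density is the whole story):

```python
import math

# digits needed to write value v in base b (hypothetical helper, exact integer math)
def digits_needed(v: int, base: int) -> int:
    digits = 1
    while v >= base:
        v //= base
        digits += 1
    return digits

v = 2**64 - 1                 # largest value a 64-bit register can hold
print(digits_needed(v, 2))    # 64 binary digits
print(digits_needed(v, 10))   # 20 decimal digits, ~3.3x fewer
print(math.log2(10))          # ~3.32 bits carried per ten-state digit
```

the catch is that each device now has to hold 10 distinguishable levels instead of 2, which is why multi-level cells survived in flash storage rather than in cpu logic.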

jews

>i have seen the future and it rocks!!…
and everything went to shit thanks to the jews like cuckerberg and datamining

who is this semen demon?

Liyu0109 on twitter

>Liyu0109
bless your soul, user.
my cock is eternally grateful

>moors law
>pretty much law

anyone with half a brain knows that that kind of progress is not sustainable. maybe for the first few jumps, but there are diminishing returns after a few cycles. i guess the internet wasn't as informative back then as it is now.
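to be fair to the 2000 posters, clocks really had been doubling every couple of years through the 90s, so the extrapolation felt reasonable. a back-of-the-envelope sketch of where that trend points (the ~1 GHz starting point and 2-year doubling are rough assumptions, not data):

```python
# naive extrapolation of 1990s clock-speed scaling, starting from
# roughly 1 GHz in 2000 with clocks doubling about every 2 years
# (both figures are ballpark assumptions for illustration)
start_year, start_ghz = 2000, 1.0
doubling_years = 2.0

for year in (2005, 2011):
    ghz = start_ghz * 2 ** ((year - start_year) / doubling_years)
    print(year, f"{ghz:.1f} GHz")
# 2005 -> 5.7 GHz, 2011 -> 45.3 GHz
# in reality single-core clocks stalled around 3-5 GHz once power
# density ended frequency scaling, and the transistor budget went
# into more cores instead
```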

This guy was actually on to something
>short sighted (5:32pm est wed jul 26 2000)
>i think that thinking about where the desktop pc is going to be in ten years is a little short-sighted. we're already moving away from the desktop with all manner of specialty appliances that focus on one area of computing. from game consoles to internet appliances to cell phones with email, we're getting away from the all-in-one computing unit. i, for one, doubt that in 10 years computing will still revolve around a single central processing unit and function as we now know it.

The mighty desktop is still useful and popular but has to compete with tablets, phones, game consoles, smart TVs, and other devices.

The market shifted towards normie social computing, then stagnated as their needs were met.

>>moors law
The Moors were driven out of Spain during the Reconquista. Their laws haven't applied for a very long time.

but they were right? my CPU is 18 x 2.8 GHz = 50.4 GHz

>where did we go wrong?

In predicting ridiculous shit like having DNA computers in a 10-year time frame while having absolutely no starting point for the technology outside of a concept in some sci-fi geek's head.

Predictions for the future are almost always wrong and when they're right it's usually only in a vague sense.

That's not how that works.

I've been waiting for years for a better battery tech that doesn't rely on lithium ions, and there's still no progress on making it consumer-ready.

for software that scales linearly with core count and clock speed, it pretty much is, but obviously i was trolling
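"pretty much is" only holds when the parallel fraction is 1.0. amdahl's law gives the general case; a minimal sketch of how the 18 x 2.8 GHz box above behaves as the serial fraction grows (the sample fractions are arbitrary):

```python
# amdahl's law: speedup on n cores when fraction p of the work
# parallelizes perfectly. the core/clock figures mirror the
# 18 x 2.8 GHz example above; the p values are arbitrary.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

cores, clock_ghz = 18, 2.8
for p in (1.0, 0.95, 0.5):
    effective = speedup(p, cores) * clock_ghz
    print(f"parallel fraction {p:.2f}: behaves like {effective:.1f} GHz")
# p=1.00 -> 50.4 GHz, p=0.95 -> ~27.2 GHz, p=0.50 -> ~5.3 GHz
```

so even a 5% serial portion cuts the "50.4 GHz" nearly in half, which is the gap between the marketing arithmetic and single-threaded reality.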

How did you know that she just posted this pic?

really makes me think...