IF you can make it better, I would like to see it. Until then keep using your 2600K
OK, listen.
Just, listen.
What if, hear me out.
What if they made CPUs bigger so you could fit more CPU into them?
Heating?
What if they used bigger fans for the bigger CPUs?
Speak for yourself. I'm happy about it. I'm broke and my i7-4790K will last me 20 years.
electricity is not instant
if they made it bigger, the problem would be signals taking too long to get from one side of the chip to the other
that's why you will not see CPUs that are 4x as big
>at least not at reasonably high clock speeds
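To put rough numbers on it, here's a back-of-the-envelope sketch in Python. The ~50% of c on-chip signal speed is an assumed round figure, not a measured value:

```python
# How far can a signal travel in one clock cycle?
# Assumes signals propagate at ~50% of the speed of light --
# a rough, commonly cited ballpark, not a measured value.

C = 299_792_458            # speed of light in vacuum, m/s
SIGNAL_SPEED = 0.5 * C     # assumed on-chip propagation speed, m/s

for clock_ghz in (1, 4, 8):
    period_s = 1 / (clock_ghz * 1e9)          # one clock period, seconds
    reach_mm = SIGNAL_SPEED * period_s * 1e3  # distance per cycle, mm
    print(f"{clock_ghz} GHz: one cycle = {period_s * 1e12:.0f} ps, "
          f"signal reach = {reach_mm:.1f} mm")

# At 4 GHz a signal covers roughly 37 mm per cycle -- and real paths
# zigzag through gates, not straight lines, so a die 4x as large eats
# the cycle budget just crossing the chip.
```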
>built computer 2 years ago and it's still near the top of the line
feels good mang
What if we built an electrical accelerator to speed up the CPU?
ingenious
we have those
they cost 6k for a reason
and no, it's not 100% Intel price gouging; 50% of it is that making a big CPU that actually works is really hard.
That's what you get when you have a monopoly, OP.
Isn't capitalism great?
there's almost no reason to upgrade from a 2xxx-series CPU if you already have a functioning quad core
My CPU runs below 40°C under full load on all cores. I think current cooling solutions could deal with a bit more heat without any problems.
Couldn't we just send those electrons through fiber instead of copper? We do this with the internet, too. Why can't we do it with electrons?
It's called fiber-OPTICS for a reason.
What reason? And why isn't it used in CPUs?
still not instant
>Optics is the branch of physics which involves the behaviour and properties of light
Like in light beams, not in electrical pulses
>4790k
>broke
>Like in light beams, not in electrical pulses
But we already use light beams to represent bits. CPUs work with bits, too. Right?
Do you comprehend how a CPU works and how incredibly difficult it would be to calculate with light alone?
>we have those; they cost 6k for a reason
Is it powered by steam? Is it Japan's fault, with their advanced technology, that we can't have nice things?
found an article:
>Valve unveils 13 new Steam Machines, from $500 to $6,000
Wat
not if you have gigahertz
>>built computer 2 years ago and it's still near the top of the line
built computer in 2012 and it's still near the top of the line
That's not good, that's depressing as fuck.
What do you need computing power for, apart from gaming, heavy simulations, or solving math problems?
It's ONE year later, you absolute idiot
No, there's a 2-year difference.
No you stupid shit, Skylake came out last year
to run today's applications, written in Node with WebKit for the UI
I know. No way I'm paying 400 euros right now for a new chip. It would be only a 2% difference in most practical applications.
And you idiots are still considering upgrading to *another* 4-core chip.
It came out in 2015, you idiot.
AUGUST OF 2015 IS 14 MONTHS AGO
NOT OVER 24 MONTHS AGO
MORON
AUGUST OF 2015 IS 2015, NOT 2016, YOU IDIOT!
480
YOU FUCKING MORON
Electrons move at a speed of around 20 inches/second
Yes. That's why we should use light, because light is way faster
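The 20 inches/second figure is the electrons' drift velocity, not the signal speed; the signal itself (the field) already moves at a large fraction of c in copper. A quick sketch of the difference, with the current and wire size as assumed example values:

```python
# Electron drift velocity vs. signal speed in a copper conductor.
# Drift velocity: v = I / (n * A * q). The current and wire size below
# are assumed example values, not anything measured from a CPU.

I = 10.0                   # current in amps (assumed)
n = 8.5e28                 # free-electron density of copper, per m^3
q = 1.602e-19              # electron charge, coulombs
A = 3.14159 * (1e-3)**2    # cross-section of a 1 mm-radius wire, m^2

v_drift = I / (n * A * q)          # how fast the electrons themselves move
v_signal = 0.5 * 299_792_458       # assumed ~50% of c for the signal

print(f"electron drift: {v_drift * 1000:.2f} mm/s")
print(f"signal speed:   {v_signal:,.0f} m/s")

# With these numbers the electrons crawl at a fraction of a millimetre
# per second; push the current density up and you can reach inches per
# second. Either way, the bit is carried by the field at a large
# fraction of c, so "switch to light, it's faster" buys far less than
# it sounds.
```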
If you need some eggs for a recipe, is quicker to pop into your corner store, or drive halfway across town?
Is the quality of the eggs halfway across town better than the eggs in the corner store?
Fucking capitalist patriarchy shitlord computer industry drags our dicks through a mile of glass to upgrade everything every 6 months, and then they get too far ahead of themselves too fast and decide to release minorly incremental bullshit. I WASN'T CONDITIONED FOR PATIENCE! I NEED SOMETHING NEW AND POWERFUL TO BUY NOW! SOMEBODY TAKE MY FUCKING MONEY!
I have an i7-4790K and it's not even worth upgrading 3 years in now. The only thing upgrading would get me is M.2, and I'm already using a Samsung 850 Pro as my main OS/boot drive. Dropping another $1500 isn't worth 1 second faster load times... or is it?
Immaterial
That's not how it works. CPUs get faster partly by shrinking the distance signals have to travel. By making CPUs larger, you slow them down, just like it takes longer to drive across town to get eggs.
So is progress slowing down?
Oh that makes it a lot easier.
As soon as you make a 14nm optical transistor, you'll be a billionaire. Until then, enjoy marginal CPU performance boosts.
What if we just optimized software better and stopped relying on an architecture from the 80s
>people mad that they don't have to spend money
All the good archs were invented in the 80s
Wooee you guys are dumb.
The reason is yields. Semiconductors are made on wafers containing many possible chips; how many depends on the size of the wafer and the size of the chip. Only a fraction, sometimes a very small one, come out perfect.
Imperfect chips can be salvaged through a process referred to as binning, based on their defects: mostly removing/disabling cores, reducing clock speed, and/or reducing voltage.
The bigger the chip, the fewer you get per wafer. The more complex the chip, the higher the chance of faulty chips. And the less salvageable the imperfect chips are, the fewer you can turn into low-budget offerings.
Latency from size is hardly an issue. MCMs, or Multi-Chip Modules, are packages containing 2+ desktop-level chips on a single package. I know that AMD plans to release up to a 4-chip MCM for servers.
The other issue is heat, since you're not increasing surface area significantly but you are increasing transistor count and power usage by quite a fucking lot.
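Rough numbers for the yield part, using the classic Poisson yield model Y = exp(-defect density × die area). The 300 mm wafer is a standard size, but the defect density and die areas below are illustrative assumptions, not fab data:

```python
import math

# Poisson yield model: the chance a die has zero defects is
# Y = exp(-defect_density * die_area). All numbers below are
# illustrative assumptions, not real process data.

WAFER_DIAMETER_MM = 300     # standard wafer size
DEFECTS_PER_CM2 = 0.1       # assumed defect density

def dies_per_wafer(die_area_mm2: float) -> int:
    """Crude estimate: wafer area / die area, minus ~10% edge loss."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2 * 0.9)

def yield_fraction(die_area_mm2: float) -> float:
    """Fraction of dies with zero defects under the Poisson model."""
    return math.exp(-DEFECTS_PER_CM2 * die_area_mm2 / 100)

for area in (150, 300, 600):   # mm^2: mainstream die, big die, 4x die
    total = dies_per_wafer(area)
    good = total * yield_fraction(area)
    print(f"{area} mm^2: {total} dies/wafer, "
          f"{yield_fraction(area):.0%} perfect -> ~{good:.0f} good dies")

# Quadruple the die area and you get a quarter as many candidates per
# wafer AND a smaller fraction of them perfect -- which is exactly why
# binning and multi-die packages (MCMs) exist.
```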
It's called photonics, and it is an area under active research. If someone manages to figure out how to properly build transistors based on that tech, then microprocessors that generate no heat and can operate in extreme temperature environments may be possible, but a good solution hasn't been found (yet).
I imagine even a working optical transistor made on a retardedly large process (by comparison to today's state-of-the-art) would still be quite revolutionary in terms of the effect it might have on the industry.
Fucking morons