Mfw my UNIRONICALLY 6 year old cpu is only 30% slower than a BRAND new cpu that cost the same as mine did 6 years ago...

what the fuck happened to cpu development?

source: cpu.userbenchmark.com/Compare/Intel-Core-i5-7600K-vs-Intel-Core-i5-2500K/3885vs619

fucking pleb. current technology is gonna plateau without further innovation.

>mfw my UNIRONICALLY 6 year old cpu

Explain to me what that means or kill yourself.

that my cpu is 6 years old

excuse english, not first language

What's an ironically 6 year old cpu?

But how can it "UNIRONICALLY" be 6 years old. What does that mean?

i have a phenom 2 965 and i can still play all modern games FOR FREE without lag

jesus christ stop being so DOMB! even a 4th grader could solve this

Kill yourself, you underage newfag. You don't belong here.

Technology made a big jump in processing in the 1990s. From there they continued to add more cores and try to fit more and more on the chips. Now they've pretty much hit the physical limits of silicon.

The real answer is that big business probably has the technology to move forward but refuses to fund it, because it's much more profitable to release something 10% better every year than to make everything obsolete at once.

There's also the factor that a big enough jump in processing power, say moving to a graphene processor that's hundreds of times faster than current gen, would make current encryption methods fall apart and expose government secrets. Remember that most technologies start as military.

>underage
>mfw im 34

also please discuss cpu development, not off-topic things

this man is right
Moore's law does not stand today.
If you want actual technological innovation you're going to have to wait anywhere from several decades to possibly centuries.

i have an ironically 6 year old cpu ask me anything

but be warned i will answer ironically due to hardware limitations

this makes sense, also I blame consoles.

How much cock does OP suck on a daily basis?

Pretty sure he meant "literally"

as in it's actually 6 years old and he's not just pulling that number out of his ass

I've got the same CPU. The other week I had to fucking replace the heatsink compound because my temps were in the 90s and it was hitching games; now it's back down to 60 at load. My graphics card is really starting to show its age though, I might wind up picking up a 480.

>34
>Barely able to articulate yourself.

Sad.

like, maybe 5 or 6 right now, my dude

op here, I upgraded my ram to 16 gig and bought a gtx 970 a while back; I can run all modern games at high or better.

mfw I can probably keep this pc for another 3 years without problems

>smug Riri

"Sad" is such a useful way of insulting somebody. Thank you for this gift, POTUS Trump

Sandy Bridge was a big step forward; every release since has pretty much just been a die shrink

and amd being consistently mediocre doesn't help

No reason to blame consoles, they receive all of the same benefits of processor technology. Consoles are just cheaper computers optimized around their given hardware.

Moore's law still stands, but it's reaching the boundaries of what is physically possible.

You can only make transistors so small.
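
For the record, the law is about transistor count doubling, not speed. A back-of-the-envelope sketch, assuming the commonly cited ~1.16 billion transistor figure for the 2011 Sandy Bridge quad-core die and the classic two-year doubling period:

```python
# Back-of-the-envelope Moore's-law projection: transistor count doubles
# roughly every two years. Baseline is the commonly cited ~1.16 billion
# transistors for the 2011 Sandy Bridge quad-core die.
BASE_YEAR = 2011
BASE_TRANSISTORS = 1.16e9
DOUBLING_PERIOD = 2.0  # years, the classic rule-of-thumb figure

def projected_transistors(year):
    """Ideal transistor count for a given year under steady doubling."""
    return BASE_TRANSISTORS * 2 ** ((year - BASE_YEAR) / DOUBLING_PERIOD)

print(f"2017 projection: {projected_transistors(2017):.2e}")  # ~9.3e9
```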

Tell me, how can something be "ironically" 6 years old?

>someone does (or will) actually believe this
why must life be such suffering

I hear it is mostly just heat issues. As in, they could make CPUs as good as they ever can, but they would just overheat and melt. So they have to work around that, resulting in a slower pace of development.

Once they move to graphene or something else, we'll see big improvements.

problem?

who /2600K/ here?

2600k, 4.8GHz here.

>desktop CPUs
There's no point in making them any more powerful. My 8 y/o one is enough to run DOOM at 60fps on high.
It runs significantly hotter than a newer model would, sure, but it's almost just as good as far as performance is concerned.

Intel stopped making shitty processors like Pentium 4.

There are no more quantum leaps to newer, better processors.

fuck off dan

Why not just make bigger CPUs? Do I have to think of everything?

that doesn't solve any problem

A friend said that Moore's law will be a thing for quite a while longer, because even when you can't make transistors smaller, you can start stacking them vertically.

Not that I know how any of this shit actually works. So it might all just be BS.

No competition to make huge improvements. Also when you think about current cpu architectures it becomes pretty obvious that we're at the end of an era. 14 nanometers son.

a bigger CPU is necessarily a slower CPU. Electricity doesn't travel instantaneously.
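
To put a number on that (an upper-bound sketch; real on-die signals travel well below light speed, so the real budget is even tighter):

```python
# Upper bound on how far any signal can travel in one clock cycle.
SPEED_OF_LIGHT = 3.0e8  # m/s, vacuum

def cm_per_cycle(clock_hz):
    """Distance light covers in one clock period, in centimeters."""
    return SPEED_OF_LIGHT / clock_hz * 100

print(f"{cm_per_cycle(4e9):.1f} cm per cycle at 4 GHz")  # ~7.5 cm
```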

I don't have a boner

The problem isn't making double the transistors for the same price
The problem is heat dissipation, plain and simple
Don't you wonder why they haven't really increased CPU clock frequencies in the last 15 years or so?
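
The first-order model behind this: dynamic power scales as C·V²·f, and higher clocks usually demand higher voltage on top. A sketch with illustrative constants, not any real chip's values:

```python
# First-order dynamic power model: P = C * V^2 * f.
# Higher clocks usually also need higher voltage, which hurts
# quadratically, so pushing frequency costs far more than linear power.
def dynamic_power(capacitance, voltage, frequency):
    """Switching power in watts for the given C (F), V (V), f (Hz)."""
    return capacitance * voltage ** 2 * frequency

# Illustrative numbers only:
base   = dynamic_power(1e-9, 1.0, 3.5e9)  # 3.5 W
pushed = dynamic_power(1e-9, 1.3, 7.0e9)  # doubled clock plus a voltage bump
print(f"{pushed / base:.1f}x the power for 2x the clock")  # ~3.4x
```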

hi i just built a 2000$ meme PC and i "upgrade" every year because goyvidia and inteljew tell me to and im going to have to ask you to DELETE THIS

Who /ironicCPU/ here?

speed of light

im going to unironically upgrade to a $3000 meme pc because someone on Sup Forums said you need the newest i7 for games
fortunately i still have my ironic 2500k

30% is a huge difference though

maybe if it was backwards compatible with ddr3 and other chip sockets.
too bad it requires buying a new motherboard and ram kit.

Get this hothead outta here.

AMD stopped making competitive CPUs for 5 years, so Intel had no reason to improve

...

who's that 40 yo cocaine addicted pornstar made of plastic?

consoles

>muh console boogeyman
faggot

We know it's you, Dan. You don't have to pretend here.

don't blame consoles.
blame AMD and GloFo.

Why would Intel release a better product in these 6 years, considering Bulldozer and its derivatives were a major failure?

I'm pretty sure you can find more than a 60% increase in ARM CPUs over the same period. Competition exists there, and it drives development.

>tfw you bought an i7 920 in 2008

>Two Thousand and 8

>Literally nearly a decade ago

>8 threads, 4 real cores, OC'd to 4.2ghz

>paired it with a new GTX 1080 8GB
>only like 5fps at worst below Guru3D benches

>runs all my programs and rendering and emulators like a dream still

aside from this old mobo not supporting DDR4, I'm literally ahead so fucking much. Literally the best CPU Intel has ever released, REAL TALK.

nodev here.
Would Sup Forums play a game called "Cervix Grinder"?
5 USD retail price to try to set it apart from the usual shovelware, but almost omnipresent 75% discount to make it affordable without much of a wallet hit.

Me, I have an fx8150.

Is it bad? I own an 8350 at 4.4GHz and very few games drop below 60

fx8320 here
everything's still fine desu

consoles

>not using an OC 2600k at 1440p
>not having the build with the most longevity of all time
>upgrading to shit with LEDS out the ass for no reason

>6 years
>30%
>this is bad apparently

GPUs are over 300% faster than 6 years ago, yeah, it's bad

Yes, it's pathetic.

Here is a comparison of a 12 year old and a 6 year old CPU in a similar consumer bracket - the same 6-year gap:

cpu.userbenchmark.com/Compare/Intel-Pentium-4-340GHz-vs-Intel-Core-i5-2500K/m293vs619

I don't know actually, I'm still using a 660ti so I'm gpu limited in most newer games. I still fell for the bulldozer meme though.

You are misunderstanding Moore's law. Hint: it's about transistor count, not speed.

>tfw 1440p korean for years now
>tfw 4670k 4.5ghz for years now
>both of them actually cost more than they did when I bought them

ok dumb question
how does CPU power affect video games?

well, at least there has been a major improvement in the mobile intel cpus

>my almost 1 year old CPU is about 35% slower than a 6.5 year old i7
thanks ayymd

The CPU sends instructions to the GPU, which means if your CPU sucks it will bottleneck your performance. The CPU usually handles physics calculations (although many are offloaded to the GPU nowadays; Crysis was famous for doing its explosions mostly on the GPU) and AI. This is why simulation games require good CPUs: the more realistic the physics, the better your CPU needs to be. Also, the better your CPU and memory, the better draw distance and level of detail you can run.
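
A crude way to picture that bottleneck, as a toy sketch with hypothetical numbers (real engines pipeline CPU and GPU work across frames, so this is the worst case):

```python
# Toy bottleneck model: a frame can't finish faster than its slowest stage.
def fps(cpu_ms, gpu_ms):
    """Frame rate when the slower of the two stages sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical numbers: slow CPU (16 ms/frame) with a fast GPU (8 ms/frame)
print(f"{fps(16.0, 8.0):.0f} fps")  # 62 fps, CPU-bound
print(f"{fps(16.0, 4.0):.0f} fps")  # still 62 fps after a GPU upgrade
```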

Intel has no competition so no reason to push themselves.

i reckon we've stagnated after 2010
1st gen Core to 2nd gen was a big jump though; 1st gen is irrelevant today but Sandy Bridge is still going strong

Money shifted away from desktop as smartphones took off, and that's why development and progress on desktop CPUs stagnated. Welcome to capitalism.

AMD being incapable of fighting Intel thanks to the Bulldozer catastrophe, resulting in Intel taking the CPU market throne and getting ~5 years to slack off until AMD could afford to make a new architecture, which is Ryzen.

Current encryption methods would take billions of years to brute force; I think they're safe for a little while.
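
The back-of-the-envelope arithmetic behind that claim, assuming a plain brute-force search of a 128-bit keyspace (the guess rate is a deliberately generous made-up figure):

```python
# Expected time to brute-force a 128-bit key, assuming on average half
# the keyspace must be searched before hitting the right key.
KEYSPACE = 2 ** 128
GUESSES_PER_SECOND = 1e18  # generous made-up figure
SECONDS_PER_YEAR = 3.15e7

years = (KEYSPACE / 2) / GUESSES_PER_SECOND / SECONDS_PER_YEAR
print(f"~{years:.1e} years on average")  # ~5.4e12 years
```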

GPUs have different structures and functions though, and they're also boosted by constant innovation in rendering technology, since optimization work there is near endless and only limited by the amount of work you can put into it.

And GPU development today is just one step behind the latest CPU process, with performance set by how many parallel processors they want to smash onto a circuit board. So it's not really development in the same sense; it's like increasing the power of your car by rigging more engines into it.

my backup computer (porn computer) still uses a phenom 2 965 and the only thing that taxes it is VR.

The same thing that happened in Obama's 8 years of presidency.

Stagnation.

Is that really her? Coke is one hell of a drug.

yes, here she is in a more natural look

You can only make a connection about 5 atoms across; anything smaller and the electrons start to jump out of the circuit and cause errors. Processing speed will not increase like it has before because there is a limit to how small things can become.
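
For scale, rough arithmetic using silicon's approximate Si-Si bond length (~0.235 nm, a textbook figure):

```python
# Roughly how wide is "N atoms across" in silicon?
SI_BOND_LENGTH_NM = 0.235  # approximate Si-Si bond length

for atoms in (5, 10, 20):
    print(f"{atoms:2d} atoms across ~= {atoms * SI_BOND_LENGTH_NM:.1f} nm")
# 5 atoms is only ~1.2 nm, far below any shipping process node
```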

Intel stopped going for raw computational power because they didn't have any real competition in that area. Instead, they invested in power saving. They already had the high-end gaming market covered; there was nothing left to gain from pouring billions into research, so it was time to solidify the market that really matters: business. Servers and workstations.

Great but not video games

We're still quite a bit away from that, though. Sandy Bridge was 32nm, Coffee Lake onwards will be 10nm, and Intel thinks 5nm is doable as well.
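
Rough density arithmetic, taking node names at face value (which hasn't been strictly true for years; marketing names and actual feature sizes diverged long ago):

```python
# If the node name tracked feature size linearly, transistor density
# would scale with the square of the shrink.
def density_gain(old_nm, new_nm):
    return (old_nm / new_nm) ** 2

print(f"32nm -> 10nm: ~{density_gain(32, 10):.0f}x density")  # ~10x
print(f"32nm -> 5nm:  ~{density_gain(32, 5):.0f}x density")   # ~41x
```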

>30% slower
>he doesn't overclock K CPUs
lmao