Have we come to a point where it's pointless to waste money on most computer components anymore for the next 8-10 years...

Have we come to a point where it's pointless to waste money on most computer components anymore for the next 8-10 years at least?

Until graphene arrives, and that may be 10 or 20 years from now, it's all worthless pseudo-core wankery.

CPU and GPU upgrades are pretty much worthless percentage increases for the money: you pay the full price of your previous device for a ~5% incremental performance increase, plus a buzzword meme feature used to hook the idiots, which either won't see implementation for a long time (software developers need a looooong time to develop new engines and code to utilize it) or is engineered to run on only one product to showcase stellar graphs as bait (Hitman wasn't even a complete game at the time that certain DX12 graph was idiotically posted).
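To put that complaint in rough numbers, here's a minimal sketch with assumed figures (the $400 price and the 5% gain are made up for illustration, not benchmarks).

```cpp
#include <iostream>

int main() {
    const double new_part_price = 400.0;  // assumed price of the replacement part, not a real quote
    const double perf_gain_pct  = 5.0;    // assumed ~5% performance improvement, per the post above

    // Price paid for each percentage point of extra performance.
    const double cost_per_percent = new_part_price / perf_gain_pct;

    std::cout << "Cost per 1% of performance gained: $" << cost_per_percent << "\n";  // $80 here
    return 0;
}
```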

Basically, 4-core CPUs and the GF 600/700 series are what's gonna do their good work for the next decade.

RAM is maybe the only exception, since browsers are coded like diarrhea, with memory leaks and missing features, while websites are coded worse and worse each year, so that seems to be the new upgrade norm.

What do you think?

I think that my 6700K @ 4.7 GHz with 4266 MHz DDR4 and a GTX 1070 blows the FPS I got on my Ivy Bridge and Maxwell GTX 970 build completely out of the water.

I think technology has stagnated, and yet somehow next year's equipment will run newer games and other software better.

Yup, Moore's Law no longer applies because we literally cannot make the circuitry any smaller than it already is. It's up to software devs to increase performance now.

Because DX10 and DX11 are finally seeing some proper code adoption, albeit not full utilization by a long shot.

Please help me understand, I am stupid: what exactly do APIs such as OpenGL and whatnot do that can make CPUs/GPUs perform so much better?

No I don't. I think the upgrades are worth it and you are just mad you can't afford them.

And also, why the fuck can't older cards take advantage of these differences?

It's not what they can do, it's what developers need to do to utilize them, and how well they do it.
APIs are management software, and management can always be improved. They minimize the time required for certain processes to communicate and do their thing, and those tiny changes stack up when you think about the number of calculations usually involved, kind of like in economics, where a seemingly small difference of a dollar or two per trade can stack up to hundreds of thousands over a period.
It's just that coders need to get their coding in order and build engines to accommodate the new APIs.
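A toy model of the "tiny changes stack up" point, with no real graphics API involved; the 5 µs per call and the 20,000 objects per frame are assumed numbers purely for illustration.

```cpp
#include <iostream>

int main() {
    const double overhead_per_call_us = 5.0;    // assumed fixed CPU cost per submission, in microseconds
    const int    objects_per_frame    = 20000;  // assumed number of objects drawn each frame

    // One call per object versus folding 100 objects into each call.
    const double per_object_us = objects_per_frame * overhead_per_call_us;
    const double batched_us    = (objects_per_frame / 100.0) * overhead_per_call_us;

    std::cout << "Per-object calls: " << per_object_us / 1000.0 << " ms of pure overhead per frame\n";
    std::cout << "Batched calls:    " << batched_us / 1000.0    << " ms of pure overhead per frame\n";
    return 0;
}
```

At 60 FPS the whole frame budget is about 16.7 ms, so in this toy model the unbatched case wouldn't even fit; that's the kind of bookkeeping a leaner API, or a better-written engine, cuts down.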

CPUs/GPUs carry out extremely complex mathematical functions and algorithms. Those algorithms can be optimized to run faster and more efficiently. This is what APIs like OpenGL etc try to do.
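A minimal example of the "same answer, fewer operations" idea, nothing OpenGL-specific, just arithmetic: summing the first n integers with a loop versus the closed form n(n+1)/2.

```cpp
#include <cstdint>
#include <iostream>

int main() {
    const std::uint64_t n = 100000000ULL;  // arbitrary problem size

    std::uint64_t slow = 0;
    for (std::uint64_t i = 1; i <= n; ++i)  // n additions
        slow += i;

    const std::uint64_t fast = n * (n + 1) / 2;  // a handful of operations

    std::cout << slow << " == " << fast << "\n";  // same result either way
    return 0;
}
```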

who here 2500k/2600k master race?

It's slowed down, but it can still get smaller.

I have one. I built my rig 5 years ago and it runs beautifully. My HD 6950 has given me trouble.

I could use a RAM upgrade to 16 GB and an SSD, though.

Are there any AMD CPUs that are on par with, or at least close to, some good Intel counterpart in terms of temperatures and performance?
More specifically, any 6-cores that are worth getting over an Intel quad?

It won't be graphmeme, but Intel is currently working on non-silicon CPUs, as we've really plateaued on that front short of just adding more connections and essentially more CPUs.
Quantum computing sounds really nice and could easily multiply all current computing power, but it's too tiny, too hard to view/manage, and just too high a goal right now. The idea behind quantum computing (trinary or more states instead of binary) could possibly be replicated, but that would take a lot of work and, more importantly, a lot of collaboration and teamwork between all the hardware/software people.

>FUTUREPROOF

So what would be a "future proof" laptop? XPS 13/15? Xiaomeme? Not interested in gaymen or rendering

>Have we come to a point where it's pointless to waste money on most computer components anymore for the next 8-10 years at least?
The last generation of consoles IS still being sold and produced.
They single-handedly held the market back, while people who buy PCs upheld the notion that these companies can release nothing but rebrands and they'll still buy them en masse.

We should've had stacked 12 nm processors/GPUs and memory as standard by now.

>Not interested in gaymen or rendering
then buy anything with the best battery life/cheapest cost...

>Quantum computing
Works only on tasks whose nature is random computation.
Simulation programs and scientific programs benefit from it,
but gayming will see almost no benefit from it, and neither will 3D modeling, media editing and processing, and such.
People don't overestimate the power of quantum computing, but they do overestimate its uses, which are very, very niche.

>Have we come to a point where it's pointless to waste money on most computer components anymore for the next 8-10 years at least?

Consumer electronics are inherently "deflationary". What I mean is that from the moment they hit the market, they're becoming ever more obsolete.

Humans have a weird instinct that anything they buy is somehow eternally worth what they paid, or at least eternally valuable to others. In fact, the vast majority of things you purchase are -- for all intents and purposes -- trash, the moment you buy them.

Sure, if you're *really* smart about some particular market (e.g. video cards), and a savvy seller, you can flip your old stuff and at least reduce the bleed-out, but bleed you will.

Waiting for arbitrary future technology makes no sense. Buy according to *your actual needs*. Is your computer broken? Then decide whether it's cheaper to buy/build a new replacement or simply repair it. Are you looking to buy a new computer? If a new release is imminent, sure, wait a month for it. But don't spend your life waiting for tech that's months or years away. You'll eternally live a miserable life.

As for this supposed technological stagnation, I think it's largely a myth. In the recent past, we were held back by the transition from desktop to browser interaction, along with slow replacement of computers with weak graphics capability. Now that the majority of consumer machines have 4+ discrete cores, all running at acceptable speed, and SSDs are overtaking primary usage, a huge amount of our older wait times have disappeared.

>random computation

What's random computation? Sounds like an oxymoron.

Let's put it this way, Quantum Computing can't be used for specific calculations and straightforward computation.
It's more of a support tool for tasks that rely on arbitrary data.
Think about why it's called Quantum Computing and the nature in which data is formulated, and then think about read/write of such data. You'll figure out the limitations then.

Better yet, ask on /sci/ since they will know how to relay this better in words.

I don't think you have any idea what you're talking about.

Very good point.

Also, I don't know why OP thinks 600/700 series GPUs will be good for the next 8-10 years. They're not even good enough to run current games at high settings with a decent resolution and framerate.

I know precisely what I'm talking about, because if I don't, then that means some renowned scientists don't know either, just because some random user on an imageboard said so.
As I said, ask on /sci/; they'll know how to relay it to you.

Or just read this:
medium.com/quantum-bits/so-when-can-i-play-video-games-on-my-quantum-computer-a8219fc75ffa#.udd8cykxj

>who here 2500k/2600k master race?
2500k masterrace checking in

2500k reporting in!

>I know precisely what I'm talking about

No, you don't. You're really badly misinterpreting and butchering the statements of others.

>because if I don't, then that means some renowned scientists don't know either, just because some random user on an imageboard said so.

No, it means you haven't understood what you've read, and you certainly haven't understood what you wrote.

>Or just read this:
>medium.com/quantum-bits

Tell you what, how about you provide the excerpts from it that support your claims:

>Works only on tasks whose nature is random computation.
>Let's put it this way, Quantum Computing can't be used for specific calculations and straightforward computation.

Moore's Law is dead and you are correct.

The good news is we have a phenomenal amount of horsepower if we could just fire Pajeet and get some real engineers to produce tight code again.

>Consumer electronics are inherently "deflationary".

This was driven by Moore's Law. That law is dead. We're scraping a few minor process improvements out of silicon, but the 2x every 18 months meme is long gone.
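For scale, here's back-of-the-envelope arithmetic on the classic 2x-per-18-months figure over a decade; this is just the math behind the claim, not a prediction.

```cpp
#include <cmath>
#include <iostream>

int main() {
    const double months    = 120.0;          // ten years
    const double doublings = months / 18.0;  // doubling period from the classic formulation
    const double factor    = std::pow(2.0, doublings);

    std::cout << "Transistor-count factor over 10 years at that pace: " << factor << "x\n";  // roughly 100x
    return 0;
}
```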

He is butchering the reasons why, but his general claim is correct. Quantum computing is not the "next big thing" outside of encryption and some forms of simulation. It's not applicable to, say, a gaming console or a web browser.

Optics or graphene are probably the next big thing and they are years away.

2500k and I'm not going to upgrade it in the near future.

>Optics or graphene are probably the next big thing and they are years away.
wish this meme would die
STACKED
is the future you fucking retards

But that won't be until after non-silicon, at least

i7-2600k here. Paid 270 euros for it 5 years ago. It's ridiculous to think how well it fares against newer CPUs. Soldered heatspreader is also quite nice for overclocking.

>It's up to software devs to increase performance now.
Yep

We're fucked

>socket so big they need 2 molds
Jesus

Nope, tech is exponentially advancing like always.

10 years from now, our processors from today will be jokes found in $20 smartwatches at Walmart.

We'll likely have optical processors by then too.

We can still make shit smaller, and with good enough cooling we can use more area and volume up to a point. The next Skylake-E processors are going to be using this: