Will quantum computing make everything we know about computers irrelevant?

One thing it's already doing is clearly separating the clueless popsci morons and actual people who know about technology.

quantum computing won't give me more fps so i don't care

i envision a quantum coprocessor used for brute forcing old passwords, and basically nothing else

Will your mom make everything we know about fleshlights irrelevant?

no because they can't even get it to work

Fuck quantum computers. I want my planck scale transistors.

no.
But since you probably don't know anything about computers, I don't see why you would care

There has been a lot of progress with quantum simulators. Any algorithm that requires 5 or fewer qubits can be implemented, and some already have been, such as the QFT. Look here:

phys.org/news/2016-08-programmable-ions-stage-general-purpose-quantum.html
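For the curious, "implementing the QFT on 5 or fewer qubits" is small enough that you can build the whole unitary and apply it yourself with numpy. Rough sketch of a purely classical simulation (the variable names and sizes are mine, nothing to do with the actual trapped-ion setup in the article):

# Classical simulation of the quantum Fourier transform (QFT) on a few qubits.
# The state of n qubits is a 2^n complex vector; the QFT is the unitary
# F[j,k] = omega^(j*k) / sqrt(N), with omega = exp(2*pi*i/N) and N = 2^n.
import numpy as np

def qft_matrix(n_qubits):
    N = 2 ** n_qubits
    omega = np.exp(2j * np.pi / N)
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    return omega ** (j * k) / np.sqrt(N)

n = 3                                   # 3 qubits -> 8 amplitudes
state = np.zeros(2 ** n, dtype=complex)
state[5] = 1.0                          # start in basis state |101>

F = qft_matrix(n)
out = F @ state                         # apply the QFT to the state vector

print(np.allclose(F.conj().T @ F, np.eye(2 ** n)))   # True: F is unitary
print(np.round(out, 3))                               # transformed amplitudes

Obviously this scales as 2^n in memory, which is exactly why people bother building the real hardware.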

>physorg

Hate it only because I got hyped for graphene in like '07. Thought it would change everything within a few years.

Popsci blows sometimes

Quantum computing is nothing more than a novelty.
At best it might reach the point where it can replace ARM CPUs.

What's quantum computing? Explain in layman's terms pls

The article explains it in layman's terms. It's not meme science.

Hold on, I need to get my flicker-free, 4K crystal ball to see the future. Yes, I can see it now:

Quantum computers will evolve into small devices with ridiculous prices. The first models will be algorithm-specific, so you will have to know which algorithm (or rather which type) you want to run before you buy one.

So most likely big companies and states will be brute-forcing our HTTPS in real time, stealing ideas, and using the information to gain even more profit.

The breakthrough will come when Intel combines a normal CPU with a separate IntelQuanExtreme module. They will make a subset of C (like with GPGPU), and some instructions will run directly on the Q unit, some only partially.

So with heavy emulation and huge research costs, Intel will bring out somewhat programmable Q chips. You know, programming atoms is a different level from making transistors as small as atoms.

And in the end people will buy the i3 with nothing, the i5 with fast cores, the i7 with hypermemeing, and the i9 with Q. Programmers will struggle for years to develop algorithms for Q that are actually faster than the heavily optimized conventional ones that already exist. And I'm afraid games will be the last thing to benefit from this technology, which will be the only interesting thing left by the time I finally reach age 75 and can apply for a pension, thanks to the collapse of the social system.

The Big Bang Theory-tier science.

Bazinga

That was a good fap.

100% accurate


A solid decade or more of work needs to be done before they're anything even remotely feasible for the consumer market. Maybe we'll have some "close enough" quasi chips that sort of display quantum speed-up, sometimes, but can do it at room temperature and without absurd magnetic shielding.

They (Google and their little pet team) are finding all sorts of new uses for the D-Wave systems they have. It's like the early days when the first IC was just fabbed and mathematicians were sitting around trying to figure out what the fuck to do with it. There will undoubtedly be a wealth of workloads suited to the hardware, but I don't see many of them being applicable to the consumer market any time soon.

If a headline is exciting, it's almost entirely bullshit.
The world is mundane; writers and journalists twist things, omit facts, and structure them in a way that makes them enticing and salacious.
Graphene is being used in hundreds of labs around the world for everything from water filtering to batteries. The reality, though, is that all of it is horribly cost-ineffective, done in small volumes, and years of work are needed before any of it is ready to scale up to mass production. And that's before facilities and tooling are even built.

A writer will tell you that something great will be here any day.
A chemist will tell you it's technically possible.
A team of engineers will tell you they need 7 years and several billion dollars before you realistically see practical results.
There's a reason you don't often see engineers quoted outside of sanitized PR statements.

>tfw no qt quantum phone with entanglement modem and neural interface

y live

>meme
Ugh, shut up.

Yes, I understand what the website is about - I just said I used to read it in '07.

"These graphene nanotubes could be used to suck the heat away from the processor resulting in a massive increase in possible voltage - the next generation of computing is right around the corner!"

Yeah, in like 50 years maybe.

>exciting headlines
I was young and dumb.

Sauce? I can't find it for some reason.

No. It's not like quantum computers perform computations that a Turing machine cannot. They perform certain operations much more efficiently than other forms of computers, but they can still be simulated deterministically.
So no, the things we know about computers and computation will not become irrelevant. Certain algorithms that require Ω(n^2) time on normal computers will run in o(n^2) time on a quantum computer (if you'll allow me to fudge the notation a bit).
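To make the "more efficient, but still classically simulable" point concrete, here's a toy numpy sketch of Grover's search simulated deterministically on an ordinary machine. The quantum algorithm finds a marked item among N with roughly sqrt(N) oracle calls instead of ~N, but the classical simulation needs 2^n memory for the state vector, which is the whole catch (the sizes and the marked index below are just illustrative):

# Deterministic classical simulation of Grover's search. Shows both claims:
# a quantum circuit can be simulated exactly on classical hardware (at
# exponential memory cost), and the quantum algorithm itself needs only
# ~sqrt(N) oracle calls instead of ~N.
import numpy as np

n = 4                       # 4 qubits -> N = 16 entries to search
N = 2 ** n
marked = 11                 # index the oracle "recognizes" (arbitrary choice)

state = np.full(N, 1 / np.sqrt(N))            # uniform superposition
oracle = np.eye(N)
oracle[marked, marked] = -1                   # flip the phase of the marked state
diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)   # inversion about the mean

iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))   # ~sqrt(N) Grover steps
for _ in range(iterations):
    state = diffusion @ (oracle @ state)

probs = np.abs(state) ** 2
print(iterations, int(np.argmax(probs)), round(probs[marked], 3))
# -> 3 iterations, finds index 11 with probability ~0.96

Compare that with a classical search, which expects to probe about N/2 entries before hitting the marked one.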

well, considering that you need two years of math classes to even understand it, I don't think so