
Quantum computing will turn everything upside down

West Texas CEO

GAF's Nicest Lunch Thief and Nosiest Dildo Archeologist
But, as a German, does this make your eye twitch wildly?
You're taking things a little bit too far, don't you think?

Have some respect.
 

Pagusas

Elden Member
Are we still in the beginning "over promise, under deliver" stage of this tech? Looking forward to the inevitable reality-crushing real-world usage, followed by eventually achieving 80% of the expectations a few decades later.
 

Irobot82

Member
Unless LK-99 is a superconductor, not much will change. To build a quantum computer today you have to cool it to near absolute zero. It's not like this will be in our handheld devices anytime soon.
 

Dice

Pokémon Parentage Conspiracy Theorist
Wouldn't the first country to develop this have the capacity to decrypt all information of any country that can be reached over the internet? Digital barriers would stand no chance; you could just straight up get taken over, or at the very least be completely exposed.
 

Kadve

Member
Quantum computers are cool and all. But we have yet to solve the problem of quantum decoherence (long story) in a way that doesn't involve cooling the system to near absolute zero with (on Earth) ultra-rare helium-3 and relying on superconductors. Until then they will be far from practical and will spend more computing power correcting their own errors than doing useful work.
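To put very rough numbers on that error-correction overhead, here's a minimal sketch (my own illustration, not from the post) using the textbook surface-code rule of thumb: the logical error rate falls off roughly like (p/p_th)^((d+1)/2) with code distance d, and each logical qubit costs on the order of 2*d^2 physical qubits. Every parameter value below is an assumption for illustration.

```python
# Back-of-the-envelope estimate of quantum error-correction overhead (illustrative only).
# Assumed model (not from the post): surface-code-style scaling where the logical error
# rate ~ 0.1 * (p / p_th)**((d + 1) / 2) and a distance-d logical qubit needs ~2 * d**2
# physical qubits. All numbers are rule-of-thumb assumptions.

def physical_qubits_per_logical(p_physical: float, p_target: float,
                                p_threshold: float = 1e-2) -> int:
    """Find the smallest (odd) code distance d whose logical error rate beats p_target,
    then return the approximate physical-qubit cost of one logical qubit."""
    d = 3
    while 0.1 * (p_physical / p_threshold) ** ((d + 1) / 2) > p_target:
        d += 2  # surface-code distances are odd
    return 2 * d * d

if __name__ == "__main__":
    # Example: hardware error rate 1e-3, target logical error rate 1e-12
    print(physical_qubits_per_logical(1e-3, 1e-12))  # -> roughly 900 physical qubits
```

Under those assumed numbers that's hundreds of physical qubits doing nothing but babysitting a single logical qubit, which is the "more power spent on correction than computation" point above.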
 

Hudo

Member
Quantum computing is really good at certain things. Really efficient search, for example: Grover search can find an element in an unsorted(!) list in O(sqrt(N)). Or factorization problems, which is why some cryptographers freaked the fuck out back then. But it's not that great at other tasks, like normal file-system shit where you'd have to work around the no-cloning theorem, or tasks where data integrity of long sequences matters.

The models of quantum computers (at least to my somewhat outdated knowledge) are all probabilistic, which isn't surprising given the nature of qubits, so you need error-correction mechanisms just to keep everything stable. Qubits are a lot less stable in general; that might get fixed over time, who knows. But right now the rule of thumb is: the more gates you add, the more noise you introduce into your probability distribution, and the more shit you need to add to denoise/stabilize it.
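To see that O(sqrt(N)) claim in action, here's a minimal sketch (my own illustration, not from the thread): a plain NumPy statevector simulation of Grover search. With 64 items it needs about 6 oracle calls, where a classical lookup in an unsorted list would expect around 32.

```python
# Minimal Grover-search simulation with NumPy (illustrative only).
# The "oracle" just flips the sign of the marked index, standing in for a real
# quantum oracle circuit; the sizes and indices below are made-up examples.
import numpy as np

def grover_search(n_qubits: int, marked: int) -> int:
    """Simulate Grover search for `marked` among N = 2**n_qubits unsorted items."""
    n = 2 ** n_qubits
    state = np.full(n, 1 / np.sqrt(n))               # uniform superposition over all items
    iterations = int(round(np.pi / 4 * np.sqrt(n)))  # ~ (pi/4) * sqrt(N) oracle calls
    for _ in range(iterations):
        state[marked] *= -1                          # oracle: flip the marked amplitude
        state = 2 * state.mean() - state             # diffusion: invert about the mean
    return int(np.argmax(np.abs(state) ** 2))        # "measure" the most likely index

if __name__ == "__main__":
    print(grover_search(n_qubits=6, marked=42))  # finds 42 among 64 items in ~6 iterations
```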

I think, as with many other things, this topic gets overhyped to hell and back.
 