Quantum computing is really good at certain things. Grover's algorithm, for example, can find an element in an *unsorted* list in O(√N) queries, and Shor's algorithm speeds up factorization, which is why some cryptographers freaked the fuck out back then. But it's not great at other tasks, like normal file-system stuff where you'd have to work around the no-cloning theorem, or anything where data integrity over long sequences matters.

The reason is that quantum computation (at least to my slightly outdated knowledge) is inherently probabilistic, which is not surprising given the nature of qubits, so you need error-correction mechanisms just to keep everything stable. Qubits are a lot less stable in general; that might improve over time, who knows. But right now the rule of thumb is: the more gates you add, the more noise you introduce into your probability distribution, and the more extra machinery you need to add to denoise/stabilize it.
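Since I'm leaning on the Grover O(√N) claim, here's a rough sketch of what it looks like, simulated classically with plain numpy (state vector, sign-flip oracle, inversion about the mean). The qubit counts, marked index, and iteration count are my own illustrative choices, not something from any real hardware run:

```python
# Minimal classical simulation of Grover's search to illustrate the O(sqrt(N)) scaling.
# Everything here (sizes, marked index) is an illustrative assumption.
import numpy as np

def grover_success_probability(n_qubits: int, marked: int) -> float:
    """Simulate Grover search over N = 2**n_qubits items and return the
    probability of measuring the marked index after ~(pi/4)*sqrt(N) iterations."""
    N = 2 ** n_qubits
    # Start in the uniform superposition |s>: every amplitude is 1/sqrt(N).
    state = np.full(N, 1.0 / np.sqrt(N))
    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
    for _ in range(iterations):
        # Oracle: flip the sign of the marked element's amplitude.
        state[marked] *= -1
        # Diffusion operator (2|s><s| - I): reflect all amplitudes about their mean.
        state = 2 * state.mean() - state
    # Measurement probability of the marked index (amplitudes are real here).
    return float(state[marked] ** 2)

if __name__ == "__main__":
    for n in (4, 8, 12):
        N = 2 ** n
        p = grover_success_probability(n, marked=0)
        print(f"N = {N:5d}: ~{int(np.pi / 4 * np.sqrt(N))} iterations, P(marked) ≈ {p:.4f}")
```

The point of the toy run: the success probability lands near 1 after only roughly (π/4)·√N iterations, whereas a classical scan of an unsorted list needs ~N/2 lookups on average. On real hardware every one of those iterations adds gates, which is exactly where the noise/error-correction overhead mentioned above starts to bite.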
I think, as with many other things, this topic gets overhyped to hell and back.