
Quantum Computing

Quantum computing harnesses quantum-mechanical particles and their properties to perform certain computations far faster than classical computers can, in some cases with exponential speedups.

Classical computing uses bits, binary digits that are each either a zero (0) or a one (1), to produce the outputs to which we are accustomed. Quantum computing instead uses qubits, typically implemented with subatomic particles, and leverages two of their quantum properties: superposition and entanglement. Superposition allows a single qubit to exist in a combination of the 0 and 1 states at the same time, and entanglement links the states of multiple qubits so that they must be described together rather than independently. These two properties are why, if successfully implemented at scale, qubits working in concert can outperform even the most powerful classical computing systems on certain problems.
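As a rough illustration (not a real quantum simulation), a single qubit's superposition can be modeled as a pair of amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1; the "equal superposition" values below are a standard textbook example:

```python
import math

# A qubit in superposition is described by two amplitudes (alpha, beta).
# Measuring it yields 0 with probability |alpha|^2 and 1 with |beta|^2.
# The equal superposition state sets alpha = beta = 1/sqrt(2).
alpha = 1 / math.sqrt(2)
beta = 1 / math.sqrt(2)

p_zero = abs(alpha) ** 2  # probability of measuring 0
p_one = abs(beta) ** 2    # probability of measuring 1

# The probabilities of all outcomes always sum to 1.
assert math.isclose(p_zero + p_one, 1.0)
print(p_zero, p_one)
```

For the equal superposition, each outcome has probability 0.5: the qubit is genuinely in both states until measured, unlike a classical bit, which is always definitively 0 or 1.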

Qubits are challenging to create and manage, which has delayed the use of quantum computers outside of research. Qubits are prone to losing their quantum state (a process called decoherence), and avoiding this requires holding them in a near-perfect, consistent environment. That dependency demands large resources and almost impracticable conditions: controlled temperatures colder than outer space, or ultra-high-vacuum chambers.

Taken together, the resource-intensive nature of quantum computing suggests that practical machines are a decade or more away, and that they will likely arrive first in the academic or defense sectors. Even then, their prohibitive expense and near-infeasible requirement for controlled settings make them highly unlikely to reach the commercial sphere anytime soon.


“Some prognosticators in the security space say quantum computing is going to dramatically impact security, such as by breaking all classical encryption. While there is a grain of truth to this in theory, quantum computing is a long way down the road, and who is to say quantum decryption won’t quickly give rise to quantum encryption?”