As data grows exponentially in this Big Data era, the processing power of today's computers doesn't quite cut it anymore. Data is now being generated faster than traditional computers can process it. This is where quantum computing comes in as an assisting tool, with the potential to make an already connected world even more streamlined.
Quantum computing is fundamentally different from classical computing in that it performs calculations with qubits, the basic unit of quantum information, rather than binary bits. A qubit leverages two quantum mechanical principles, superposition and entanglement, to encode and process information. While a traditional bit holds a definite value of either zero or one, superposition allows a qubit to "…exist in a superposition of zero and one (i.e. has non-zero probabilities to be a zero or a one) until measured"2. Entanglement links the states of two or more particles so that measuring one determines the outcome we will see when measuring the others. Together, these principles mean that when the state of one particle is measured, the other particles in the entangled system seem to "know" that this measurement has occurred and take on correlated values, no matter how far apart the particles are at the time of measurement. This is Einstein's "spooky action at a distance". Because a register of qubits can exist in a superposition of many states at once, quantum computers can work through certain problems dramatically faster than classical machines.
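These two principles can be illustrated with a few lines of arithmetic on complex amplitudes. The following is a minimal sketch in plain Python, not a real quantum SDK: a Hadamard gate puts a single qubit into an equal superposition, and a Bell state shows the perfect correlation between measurement outcomes that entanglement produces.

```python
import math
import random

h = 1 / math.sqrt(2)

# --- Superposition ---
# A single qubit is a pair of amplitudes (amp0, amp1).
# Applying a Hadamard gate to |0> gives (|0> + |1>) / sqrt(2).
amp0, amp1 = h, h
# Born rule: measurement probabilities are squared magnitudes.
print(abs(amp0) ** 2, abs(amp1) ** 2)  # 0.5 each: zero or one, 50/50

# --- Entanglement ---
# Two-qubit amplitudes, ordered as outcomes [00, 01, 10, 11].
# The Bell state (|00> + |11>) / sqrt(2):
bell = [h, 0.0, 0.0, h]
probs = [abs(a) ** 2 for a in bell]
print(probs)  # [0.5, 0.0, 0.0, 0.5]

# Sampling measurements shows the correlation: only "00" and "11"
# ever occur, so reading one qubit fixes what the other will show.
outcomes = random.choices(["00", "01", "10", "11"], weights=probs, k=20)
assert all(o in ("00", "11") for o in outcomes)
```

The sketch only simulates the bookkeeping; on real hardware the correlation holds however far apart the two qubits are.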
As we see greater amounts of data being created, the ability to quickly analyze and use that data to deliver insights will be key to business and societal advantage. The application of this technology will surface most quickly in areas where modelling the real world is core to the work. For example, the desire to create more custom chemical compounds in industries like agriculture and medicine will drive adoption of this technology. Simulations of the natural world, such as fluid dynamics and meteorology will also benefit and quantum computing will provide researchers and businesses with the capability to perform far more advanced calculations. This will lead to faster solutions to physical world problems like molecular, material, and chemical design.
On that note, materials science and pharmaceutical research will benefit tremendously from quantum computing, as it dramatically increases the speed of molecular comparison while decreasing the cost and time of developing these compounds via automated computing processes. By completing this early phase work more quickly and through a computational process, innovation becomes safer and more economically sound2.
Cryptography is currently a cornerstone of today's digital security and privacy infrastructure, but this may not remain true for long. It will be completely disrupted: traditional public-key cryptography relies on the difficulty of factoring large numbers into primes, a problem that Shor's algorithm3 solves efficiently on a quantum computer but that does not scale on classical hardware. In fact, it is estimated that quantum computers advanced enough to break existing encryption structures may be available by the end of 20304.
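To see why factoring underpins this risk, consider the cost of the simplest classical approach. The toy example below (not Shor's algorithm, which requires quantum hardware) factors a small RSA-style modulus by trial division; its running time grows roughly with the square root of n, which is exponential in the number's bit length.

```python
def trial_division(n):
    """Factor n by checking divisors up to sqrt(n).

    The loop runs on the order of sqrt(n) times, which is
    exponential in the bit length of n. Real RSA moduli
    (hundreds of digits long) are far out of reach this way,
    which is exactly what their security depends on.
    """
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n itself is prime


# A toy RSA-style modulus: the product of two small primes.
print(trial_division(3233))  # (53, 61), since 53 * 61 = 3233
```

A quantum computer running Shor's algorithm would instead find the factors in time polynomial in the bit length, which is what breaks the scheme.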
While the large technology players are deploying programming languages and technologies to lay the groundwork for quantum computing5, smaller start-ups such as Rigetti and D-Wave are already delivering a variety of quantum computing "as a service" offerings in the cloud for business and research6,7.
While Google recently announced it had achieved "quantum supremacy"8 (the moment a quantum computer performs a task that no classical computer could complete in a practical amount of time), industry estimates still put broader adoption of quantum computing 10 to 15 years out. With that said, revenues from quantum computing are estimated to increase 80-fold over the next 10 years9.
However, there are still limitations in quantum computing, including the ability to handle errors and maintain quantum states for extended periods. Further, many of the components necessary to build quantum computing hardware require extremely rare materials that are difficult to source and bring to production10. As a result, the technology is currently prohibitively expensive, but as advances are made in construction and operation, the price can be expected to decrease. On the utilization side, a new computing paradigm will need to be developed and learned by programmers and scientists. This will slow initial adoption and keep initial development resources relatively expensive.

Contributors
Main author: Chris Pelsor
Editor: Tiffany Hildre