Quantum Computers And Impact On Cyber Security

Threats and opportunities in the new realm of computation.
Quantum computers and quantum information science use the basic properties of nature to allow a fundamentally different type of computation. In this series of articles we explore this emerging field and discuss its impact on the cyber security landscape.

Are quantum computers going to break encryption and reveal all our secrets? Do they allow for massive parallel computation, and will they dwarf supercomputers? These types of questions are getting increased attention, from the occasional chat by the coffee machine all the way to prime-time national media. The topic is on the agendas of the boards of multinational companies and of governmental bodies. Some see it as THE solution to all our problems, while others fear it will be our demise.

In this series of articles, we discuss the potential impact of quantum computers on the cyber security landscape. We start in the present article with a general discussion of quantum computers and what they can be used for. In the following articles we will elaborate on the impact this new quantum technology has on cyber security and, as a consequence, on the present and future risks that organizations should prepare for.

Finally, we will discuss how to leverage the cyber security opportunities quantum computing might provide. The bottom line of this series of articles is: don’t panic, but be prepared. Quantum computers are not going to break cryptography anytime soon, nor will they deliver incredible computational speedups in the near term. However, they may pose a real threat in the long term, and cryptographic implementations have proven very difficult to change in practice, as the MD5 and SHA-1 migrations have shown. It is therefore imperative to start planning now and to take preliminary actions to avoid unplanned transformation costs in the future.

Quantum computers: How do they work and what can we expect from them?

The development of quantum theory, which started at the beginning of the 20th century, has been an enabler for much of the technology we now take for granted. Devices such as LEDs, silicon chips and solid-state drives (SSDs), to name a few, are based on the principles of quantum mechanics. Presently, we are at the dawn of a new era in which quantum theory doesn’t only help us design better components for computers but also enables us to process information in a fundamentally different way, using these “weird” effects of quantum physics.

Quantum information theory and its potential implementation in quantum computers hold promise to solve computational challenges that are intractable with classical digital computers. In this section we explain the basic principles of how a quantum computer works, how it differs from classical computers, and what kinds of problems we expect quantum computers to excel at.

What makes a quantum computer “quantum”?

The most important difference between a classical and a quantum computer is the fundamental unit of information - the “bit”. Whereas a classical bit can be in only one of two states (0 or 1), a “qubit” (quantum bit) can be in both states at the same time. This is called the principle of superposition and is one of the most fundamental concepts in quantum theory.

Unfortunately, a qubit in a superposition state cannot be directly observed. When a superposition state is measured, the qubit will “collapse” into the 0 or 1 state in an unpredictable way. This probabilistic nature of quantum mechanics is very counterintuitive. Even Albert Einstein did not believe in this randomness of nature, hence his famous statement that “God doesn’t play dice with the universe”.
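
To make superposition and measurement collapse a bit more concrete, here is a minimal sketch in plain Python/NumPy (an illustration only, not tied to any real quantum hardware or programming framework) that represents a single qubit as two amplitudes and simulates repeated measurements:

    import numpy as np

    # A qubit is described by two amplitudes (a, b) with |a|^2 + |b|^2 = 1:
    # |a|^2 is the probability of measuring 0, |b|^2 the probability of measuring 1.
    equal_superposition = np.array([1.0, 1.0]) / np.sqrt(2)   # 50% zero, 50% one

    rng = np.random.default_rng()

    def measure(state):
        """Simulate one measurement: the qubit 'collapses' to 0 or 1 at random."""
        return rng.choice([0, 1], p=np.abs(state) ** 2)

    # Repeating the measurement on identically prepared qubits gives ~50/50 outcomes.
    outcomes = [measure(equal_superposition) for _ in range(1000)]
    print("fraction of ones:", sum(outcomes) / len(outcomes))   # close to 0.5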

How do quantum computers leverage quantum properties to perform computations?

To answer this question, let’s first look at the steps of a classical computation, with the example of calculating how much income tax an employee has to pay annually. The first step is to choose the input, in this case the annual salary of an employee. The second step is the actual computation, where we use the input and other relevant values (constants) to calculate how much tax the employee needs to pay. The third and last step is to present the output to the user.
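
As a trivial illustration of these three classical steps, here is a toy Python sketch (the flat 30% tax rate is a made-up placeholder, not an actual tax rule):

    # Step 2: the computation, using the input and a constant (a hypothetical flat tax rate).
    def income_tax(annual_salary, tax_rate=0.30):
        return annual_salary * tax_rate

    salary = 50_000                 # Step 1: choose a specific input (the annual salary)
    print(income_tax(salary))       # Step 3: present the output to the user (15000.0)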

Quantum calculations work in a fundamentally different way and also have different components. The first step in a quantum calculation is to prepare the input as a superposition of all possible states, rather than to select a specific initial state. Sounds strange, right? We illustrate how this works in the following way: imagine preparing 3 qubits in an equal superposition state (50% zero and 50% one). To understand what this state represents, we look at what happens if we measure the qubits.

Every time we measure this state, we observe a random collection of zeros and ones, with each of the 8 possible classical states equally likely to be found. This means that, as long as this superposition state is not measured, the qubits are actually in all possible classical states at the same time. From a computational perspective this is a very interesting property, as we can perform a computation on multiple states simultaneously. This process is called quantum parallelism.

Figure 1: Three qubits in a superposition state maintain that superposition as long as they are completely decoupled from any outside influence; once measured, they “collapse” randomly to one of the 8 classical states.
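
The same state-vector picture extends to several qubits. The sketch below (again plain NumPy, purely illustrative) builds the 8-amplitude state vector of 3 qubits in an equal superposition and samples measurements from it; each of the 8 classical states shows up with probability roughly 1/8:

    import numpy as np

    n_qubits = 3
    n_states = 2 ** n_qubits                       # 8 classical states: 000 ... 111

    # Equal superposition: every classical state has amplitude 1/sqrt(8).
    state = np.full(n_states, 1 / np.sqrt(n_states))

    rng = np.random.default_rng()
    shots = rng.choice(n_states, size=8000, p=np.abs(state) ** 2)

    # Each outcome occurs roughly 1000 times out of 8000 shots (probability ~1/8).
    for outcome, count in zip(*np.unique(shots, return_counts=True)):
        print(f"{int(outcome):03b}: {count}")
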
The second step in a quantum calculation is to process the input using a series of logical gates. Besides analogues of the familiar logical gates such as OR, AND and XOR, quantum computers have more complex gates that can create and process superposition states. In addition to gate operations, measuring the state of the qubits also affects their value; in fact, measurements of the qubits are an indispensable tool in the computation phase. This leads to the interesting observation that a quantum computation, the process of performing the operations, must be a black box.

We are, by definition, not allowed to measure the state of the qubits during execution unless the measurement is part of the algorithm itself, as a measurement is actually an operation on the data. The third step of a quantum calculation is to produce the output, which is always done by a final measurement of the qubits. The state of the qubits before the measurement can still be a superposition state, but it doesn’t have to be. If it is not a superposition state, then the measurement will always yield the same result, and this measurement directly produces the final output of the calculation.

However, if the final state is a superposition, then the final measurement can give a different output each time it is repeated. In this case, the whole procedure must be repeated multiple times to determine which result occurs with the highest probability; the output with the highest probability is then the correct answer of the calculation. Probably the most unintuitive aspect of quantum computations is that the same calculation can have different outcomes.
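
As a toy illustration of these last two steps, the sketch below (a hand-written single-qubit rotation gate in NumPy; not meant to reflect any real quantum programming framework) applies a gate that leaves the qubit in an unequal superposition, repeats the final measurement many times, and keeps the most frequent outcome as the answer:

    import numpy as np

    rng = np.random.default_rng()

    # Step 2: process the input with gates. This single-qubit rotation gate leaves
    # the qubit in an unequal superposition (roughly 20% zero, 80% one).
    theta = 2 * np.arcsin(np.sqrt(0.8))
    rotation = np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                         [np.sin(theta / 2),  np.cos(theta / 2)]])
    state = rotation @ np.array([1.0, 0.0])        # start from the classical state 0

    # Step 3: the final state is still a superposition, so we repeat the measurement
    # many times and keep the most probable outcome as the answer.
    shots = rng.choice([0, 1], size=1000, p=np.abs(state) ** 2)
    counts = np.bincount(shots, minlength=2)
    print("counts:", counts, "-> answer:", int(np.argmax(counts)))   # answer: 1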

What kind of problems can be solved on a quantum computer?

Developing new quantum algorithms is an active field of research that has seen increasing growth in the past couple of years. Many companies and research institutes are trying to come up with new quantum algorithms that solve relevant computational problems better than classical algorithms. The main quantum algorithms developed so far are (for a more exhaustive list, see the “Quantum Algorithm Zoo”):
  1. Shor’s algorithm: factorization of large numbers and solving the discrete logarithm problem. This algorithm has the potential to break most of the currently used public-key cryptography, and will be the main topic of the following articles.
  2. Grover’s algorithm: searching in an unsorted list. This is a generic method that can be applied to many types of computational problems. The downside of this algorithm is that the speedup it offers is more limited than that of, e.g., Shor’s algorithm, so its results are not expected to be as spectacular as those of other algorithms, but they are nonetheless important for some applications (a small simulation sketch follows this list).
  3. Quantum Approximate Optimization Algorithm (QAOA): A general method to solve optimization problems under specific conditions. Many problems in finance, manufacturing, transport etc. can be formulated as an optimization problem, which shows the potential impact of this algorithm.
  4. Harrow-Hassidim-Lloyd (HHL) algorithm: This algorithm solves a linear system of equations. As linear systems are at the core of many science and engineering problems, the potential speedup offered by the HHL algorithm can have a major impact.
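
To give a feel for how such an algorithm behaves, here is a small classical simulation of Grover’s search in plain NumPy for a list of 8 items (3 qubits); it only reproduces the mathematics on a classical computer, so there is of course no speedup, but it shows how two Grover iterations amplify the probability of the single “marked” item to roughly 95%:

    import numpy as np

    n_qubits = 3
    n_items = 2 ** n_qubits          # searching an unsorted "list" of 8 items
    marked = 5                       # index of the item we are looking for

    # Start in the equal superposition of all 8 classical states.
    state = np.full(n_items, 1 / np.sqrt(n_items))

    # Oracle: flip the sign of the marked item's amplitude.
    oracle = np.eye(n_items)
    oracle[marked, marked] = -1

    # Diffusion operator: reflect every amplitude about the mean amplitude.
    diffusion = 2 * np.full((n_items, n_items), 1 / n_items) - np.eye(n_items)

    # About (pi/4) * sqrt(N) iterations are needed; for N = 8 that is 2.
    for _ in range(2):
        state = diffusion @ (oracle @ state)

    print("probability of the marked item:", np.abs(state[marked]) ** 2)   # ~0.95
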
An important factor in the success of a quantum algorithm is its performance compared to classical algorithms. From a theoretical point of view, there are certain computational problems that are believed to be intractable with classical computers (or at least infeasible within a reasonable time). An implementation of a quantum algorithm that solves such a problem would be a great achievement. On a more practical level, a quantum algorithm will already be extremely successful if it simply performs better than the best known classical algorithm.

This creates a (healthy) competition between classical and quantum algorithm designers. Interesting examples of this competition are recent implementations of QAOA and HHL that seemed to outperform the best known classical algorithms. This success was of very limited duration, as new classical algorithms were soon developed that perform even better. The bottom line is: even before the first demonstration of a large-scale quantum computer, progress in this field is already having a positive effect on computation in general.

So when do we expect quantum computers to be operational?

Building the hardware for a quantum computer is a formidable challenge. Qubits are by nature very fragile and can lose the information encoded in them very quickly (the decoherence problem). The major challenge is to keep the qubits completely isolated from the environment while still allowing high-precision control and readout of the qubit state. To effectively decouple qubits from any noise source, and therefore sustain longer coherence times, these systems are typically cooled to extremely low temperatures using liquid helium.

This puts a heavy burden on the size of the system and results in high running costs. There are many different ways to realize qubits, e.g. trapped ions, superconducting circuits and many others. Each architecture has its advantages and drawbacks, and it is not yet clear which qubit technology is the most scalable one.

When trying to predict the future progress of quantum computers, the qubit count is often used as the primary indicator. This is misleading, as there are other parameters that can inhibit progress even when the qubit count continues to increase. One of the biggest challenges is the ability to perform 2-qubit operations between any two qubits, a property called qubit connectivity. Even in small quantum computer platforms that employ a single-digit qubit count, such as the IBM Q Experience, not all qubits are connected, which significantly reduces the ability to run complex quantum algorithms. Clearly, when scaling up the number of qubits, the connectivity issue becomes even more challenging.

To predict when we can expect a quantum computer to be operational we also need to consider the hardware requirements of the specific quantum algorithm. These requirements can differ substantially depending on the type of calculation we want to perform. For example, to implement Shor’s algorithm for factoring a 2048 bit number we need more than 4000 stable qubits and billions of logical gates. As the duration of such a calculation will be much longer than the time that qubits can stay stable (coherence time), other techniques are then required to maintain the qubit information.
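
As a rough sanity check of that qubit figure (this uses the roughly 2n + 3 logical qubits required by one well-known circuit construction for Shor’s algorithm, Beauregard’s; other constructions trade qubits against gate count differently):

    n = 2048                        # size of the RSA modulus in bits
    logical_qubits = 2 * n + 3      # one published construction needs about 2n + 3 logical qubits
    print(logical_qubits)           # 4099, i.e. "more than 4000" stable (logical) qubits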

These methods, such as quantum error correction, are far beyond the current capabilities of quantum computers. Therefore, even the most optimistic (without being unrealistic) predictions estimate a period of 10 years before the first demonstration of Shor’s algorithm on a 2048-bit number. On the other hand, the QAOA algorithm could outperform a classical computer with 100-200 qubits and a very shallow circuit depth (loosely defined as the number of logic-gate operations), so quantum error correction is not strictly required. This is also the reason QAOA is one of the candidates to demonstrate quantum supremacy.

To summarize this section, quantum computing hardware still needs to improve on various fronts, and the requirements of the various quantum algorithms are very different. Some algorithms are expected to be demonstrated within a year or two, while others may take many years or even decades to demonstrate experimentally.

Closing remarks

Quantum computers represent a paradigm shift in computation. We are entering a fascinating period in the development of quantum computers. Quantum systems are scaling up in both size and reliability and are getting close to showing a real advantage over classical computers. As this technology is still in such an early phase, it may be that its true impact is not even fully understood yet. This makes this field even more fascinating to follow.

In the next article we will focus on how quantum computers could potentially disrupt the cyber security landscape. We will present the threats and opportunities of quantum computers and analyze which domains are predicted to be most affected.