Article 1 - https://www.qcguy.com/blog-1-quantum-mechanics-the-basis-of-quantum-computing/

Article 2 - https://www.qcguy.com/blog-2-quantum-computing-revolutionary-and-not-just-evolutionary/

Article 3 - https://www.qcguy.com/blog-3-quantum-computing-the-inner-workings/

For readers who haven’t endured (yes, endured!) my previous three articles on QC, I suggest you go through them first, as this article requires some basic understanding of QM and QC, and those articles contain sufficient content to prepare you for the topics I am going to discuss here. Those who have been brave enough to plough through my previous articles will have learnt a few weird and wonderful characteristics of the quantum world: a subatomic quantum particle (e.g. an electron) can be in more than one place at the same time, behaving as a wave; a quantum particle can be in more than one state at the same time, which is known as superposition; the behaviour of a quantum particle changes depending on whether or not it is being “observed”; the properties of Quantum Mechanics (QM) are being exploited to create a whole new way of computing in the form of Quantum Computers (QC); two entangled quantum particles reflect each other’s state changes instantaneously, irrespective of the distance between them; and many more.

In this article, I will be talking about the various practical usages of QC. These usages fall into two categories: extension and uncharted.

Extension – use-cases that aren’t feasible on today’s classical computers because of the speed they lack. Here a QC acts as an extension of the capabilities of classical computers, by exploiting the interference and parallelism properties of qubits.

Uncharted – use-cases that are beyond the capability of classical computers altogether and can only be solved using a QC. These are the use-cases where mimicking nature’s behaviour is required to arrive at a solution.

In this two-part article, I will be touching on a few practical implementations of QC that we can currently think of. However, just as with any other technology, there will be many more use-cases that we can’t even begin to imagine today; over the next few decades, as QC becomes more practical and mainstream, such use-cases will become apparent. Take Blockchain, for example. In 2008, Satoshi Nakamoto (whoever or whatever that may be) published a white paper (https://bitcoin.org/bitcoin.pdf) proposing a digital currency called Bitcoin. It took a couple of years before the technologists of this world realized that the underlying technology, the blockchain, a form of Distributed Ledger Technology (DLT), is a game-changer, and slowly other usages of DLT that would have been unimaginable even a few years earlier started springing up. Now, some 12 years down the line, while Bitcoin still exists (and has made some people very, very rich), DLT is being used for financial transactions, large-scale trade-related insurance underwriting, and supply-chain management, amongst other interesting implementations.

So, without further ado, here are some of the practical QC use-cases that I believe are important to know and understand in order to build further on the information gained in the past three articles.

True Random Number Generator (Uncharted)

I will start with a short story about Paul Michael Larson, an American contestant on the television game show Press Your Luck in 1984. Larson is notable for winning $110,237 (equivalent to about $271,000 in 2019) in cash and prizes, at the time the largest one-day total ever won on a game show. This matters for our topic because of what Larson did to win that large pot of cash: he found a pattern in the “randomiser” used in the game’s “Big Board”.

Random Number Generation (RNG) seems a trivial thing to talk about, but randomness, true randomness, is in fact the cornerstone of a lot of what happens in both digital and real life. And in both cases, entropy is the source of randomness. In the real world, entropy is the arrow of time inducing randomness and movement from order to disorder. In the computing world, entropy is the input (the seed) to the RNG (whether software or hardware). This input is gathered by observing things such as computer mouse movements, keyboard clicks, outside temperature, or even the microwave radiation of the universe. The higher the entropy, the more convincing the randomness of the RNG using it as a seed.

For example, many systems use the cosmic microwave background radiation (the electromagnetic radiation left over from the Big Bang) as a seed (https://tinyurl.com/s57nzja), since nobody can predict how this background noise will behave at any given point in time. By today’s knowledge, a randomness system using such an unpredictable seed is effectively random: nobody can predict how the seed will behave, and so nobody can predict the random number it generates. This type of seed may even remain impossible to predict in the future. Note, however, that if we obtain the same measurement of the cosmic microwave background as somebody else and feed it into the same random number generator, we will be able to predict their result.

The security of many forms of encryption relies on the computational difficulty of solving certain problems. For example, factoring is difficult on a classical computer, so some types of encryption are based on the difficulty of factoring, and you can have an encryption key that is the product of two large prime numbers. If one of the prime numbers is reused in another encryption key, that breaks the security of the algorithm. Now, this shouldn’t be a problem, because if prime numbers are selected randomly, the probability of two keys using the same prime should be essentially zero. But in 2012, researchers found that when they looked at all the publicly accessible keys for RSA, a commonly used public-key cryptosystem, about a quarter of a percent of the time two unrelated keys shared a prime factor. The authors speculated that a lack of entropy, or randomness, in one of the inputs to the key-generation algorithm resulted in this repetition of factors, which essentially made those systems completely insecure. Beyond public-key encryption, many other cryptographic tasks require randomness, such as challenge-response protocols (like those used at an ATM) and digital signatures. Modern-day cryptography relies on random numbers, and they are also important for scientific applications, like Monte Carlo simulations, and for lotteries.
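To make the shared-prime weakness concrete, here is a minimal Python sketch. The primes below are tiny, made-up values (real RSA moduli are 2048+ bits), but the point carries over: if two public moduli happen to share a prime factor, a plain gcd exposes it, with no factoring required.

```python
# Toy illustration of the shared-prime weakness (hypothetical values; real RSA
# moduli are 2048+ bits). A gcd of two moduli that share a prime reveals that prime.
from math import gcd

p, q1, q2 = 101, 103, 107      # toy primes; p is accidentally reused in both keys
N1, N2 = p * q1, p * q2        # two "unrelated" public moduli sharing the prime p

shared = gcd(N1, N2)           # Euclid's algorithm: fast even for huge numbers
print(shared, N1 // shared, N2 // shared)   # 101 103 107 -- both keys fully factored
```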


True randomness is difficult to realize. To generate a truly random number, we need to find something in nature that we cannot perfectly predict, something that can only be described by probability. Until as recently as the early 20th century, such a thing was believed not to exist, but the theory of QM changed all that and opened the door to an even stranger and more unpredictable world. Since QCs are built on the fundamentals of QM, they can manipulate the superpositions of particles (qubits), which are governed by probability, making them perfect tools to harness the nature of the quantum world and build a true random number generator.

One of the most promising lines of research here is quantum random number generators (QRNGs), which rely on either superposition of quantum states or quantum entanglement to create randomness. A simple example of how this works on a single system is to prepare a single qubit in the superposition state |ψ⟩ = (1/√2)(|0⟩ + |1⟩) and then make a measurement in the {|0⟩, |1⟩} basis. Since |ψ⟩ is an even superposition of both basis states, the outcome of the measurement cannot be predicted, even in principle, and is therefore a source of random numbers. To further guarantee the randomness of the numbers, techniques known as “self-checking” QRNGs have also been developed.
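For the curious, here is a minimal sketch of such a single-qubit QRNG written with Qiskit (this assumes the qiskit and qiskit-aer packages are installed; API details can differ between versions). One caveat: a classical simulator can only mimic the measurement statistics using pseudo-random numbers; genuinely quantum randomness requires running the circuit on real quantum hardware.

```python
# A minimal single-qubit QRNG sketch (assumes qiskit and qiskit-aer are installed).
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(1, 1)
qc.h(0)           # Hadamard gate: |0> -> (1/sqrt(2))(|0> + |1>), an even superposition
qc.measure(0, 0)  # measurement collapses the state to 0 or 1 with equal probability

result = AerSimulator().run(qc, shots=8).result()
print(result.get_counts())  # e.g. {'0': 5, '1': 3} -- each shot is one random bit
```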

Solving the Nitrogenase Problem (Uncharted)

Bacteria use an enzyme called nitrogenase to fix atmospheric nitrogen into ammonia. And why is that important for our discussion in a QC-related article, you may ask? Because ammonia forms the foundation of the nitrogen fertilizer industry, and without fertilizers there is absolutely no way of producing enough food to feed the world’s 7+ billion mouths. But the process we recreate artificially to manufacture fertilizers for the agriculture industry is estimated to consume as much as 3% of all energy produced globally.

The key step in manufacturing fertilizer is a process called nitrogen fixation. In industry, nitrogen fixation is performed using the Haber-Bosch process, which requires both high pressure and high temperature. The process therefore uses a lot of energy and produces an abundance of greenhouse gases. Because of this, the carbon footprint of a loaf of bread, from growing and harvesting the wheat to baking the dough, is about 590 g, 40% of which is carbon dioxide emitted during nitrogen fertilizer production. If chemists knew the details of the nitrogenase mechanism, and more about how transition metals behave in general, they could make better, more efficient catalysts for producing fertilizer, as well as renewable fuels and many other industrially important chemicals.

In contrast, there exist bacteria which use an enzyme called molybdenum nitrogenase that can catalyze the conversion of atmospheric nitrogen into ammonia even at room temperature.

How do the bacteria do it, then? We don’t really know precisely. We do know the key component is an iron-molybdenum (Fe-Mo) cofactor, a chemical compound that acts as a helper molecule for the enzyme. However, it is not known how the reaction works in detail. Simulations on classical computers exist, of course, but only provide approximate answers. If we could actually study that process and understand it, then we could potentially engineer a catalyst that makes this reaction efficient. Studying it exactly on classical computers, however, would take on the order of the lifetime of the universe; in other words, classical computers would require 13+ billion years to solve nature’s nitrogenase reaction. An 80-qubit QC, on the other hand, could solve it in days by performing a quantum simulation of the chemical reaction steps, providing the reaction energies involved in nitrogen fixation.

Breaking Cryptography or Making it Stronger (Extension)

Cryptography – the current lay of the land -

The Enigma machine is a legendary electro-mechanical rotor cipher machine used by German forces before and during World War II. A message – the “plaintext” – typed by an operator is encoded by the machine – similar to a typewriter in appearance – in the following manner: each character is replaced with a different letter according to the wiring of the machine, in a way that appears random. The next letter is replaced following the same mechanism, but with a rotor shifted to a different position; the shifted rotor position changes the electrical pathway and hence the character substitution table. Thus, each character of the message is replaced according to a different encoding scheme. The resulting “ciphertext” can subsequently be transmitted via regular public communication channels, and it can only be decoded using a machine with the same configuration, which reverses the encoding and reveals the true content of the message as plaintext again.

https://upload.wikimedia.org/wikipedia/commons/thumb/3/3e/EnigmaMachineLabeled.jpg/800px-EnigmaMachineLabeled.jpg

Figure: German Enigma Machine

In 1941, at Bletchley Park, a British team including Alan Turing, building on pre-war Polish cryptanalysis, uncovered the working principles of Enigma for the Allies. The intelligence derived from the deciphered intercepts provided the Allies with a significant advantage that ultimately became a key element in their victory over the Axis powers.

The Enigma machine is an example of a symmetric encryption scheme, where the same key is used to encode and decode a message. Symmetric encryption schemes are still in use today. However, since World War II, cryptographic key generation has evolved from electro-mechanical methods to digital bit strings, constructed classically, serving as encryption keys. These keys are designed to be mathematically difficult to derive, requiring a significant amount of computational power and time; the computational power and time required to break a key determine its security.

In 1975, the Data Encryption Standard (DES) was introduced by IBM. The DES algorithm generates a 56-bit symmetric encryption key using a complex mathematical method. Despite this mathematical complexity, DES was known from the beginning to be potentially susceptible to a brute-force key-search attack due to its relatively small key size. In fact, by the beginning of the 21st century, computational power had advanced to the point where the key could be determined in less than a day. For example, Distributed.Net, an organization specializing in the use of idle computer time, utilized more than 100,000 computers, ranging from slow PCs to powerful multiprocessor machines, to test 250 billion keys per second and thereby determine a particular 56-bit key in about 23 hours. Given the increasing vulnerability of DES, it was replaced in 2001 by the Advanced Encryption Standard (AES), a more secure symmetric-key cryptography scheme.

However, there remained a conundrum: how can two parties exchange a symmetric key to initiate secure communication, before they have a secure channel on which to send it? In 1976, a new method for distributing cryptographic keys was introduced by Whitfield Diffie and Martin Hellman – referred to as the Diffie-Hellman key exchange protocol – which heralded the era of public key cryptography with asymmetric keys. Asymmetric keys comprise a public key – used to encrypt messages – and a mathematically related (but unique) private key – used to decrypt messages. The security of public key cryptosystems is premised on the use of one-way mathematical functions to create the public and private keys. One-way mathematical functions are straightforward to calculate in the forward direction but prohibitively complex to calculate in the inverse direction.

Figure – One-Way Functions. Computing the product N = p × q is easy given p and q, but deriving p and q given only N is prohibitively difficult.

A widely used public-key cryptosystem was published in 1978 by Ronald Rivest, Adi Shamir, and Leonard Adleman – referred to as the RSA algorithm. It was declassified in the late 1990s that asymmetric encryption schemes like RSA had already been developed by the British Government Communications Headquarters (GCHQ) in the early 1970s. With public-key cryptosystems, two parties can exchange public keys and thereby establish a secure link without having symmetric keys in hand at the beginning. Once a secure link is established, they have the option of continuing to communicate using asymmetric keys, or they can use the newly established secure link to exchange symmetric keys and switch to symmetric encryption/decryption protocols.

In practice, public-key encryption techniques are primarily used to establish a secure channel and distribute symmetric keys, which are in turn used to encrypt messages. This is because encryption using an asymmetric-key protocol is generally less resource-efficient than using a symmetric-key protocol. For example, a 128-bit symmetric key has 2^128 different possible key combinations. In contrast, a 3000-bit asymmetric key can leverage only the fraction of the 2^3000 possible bit strings that comply with the conditions required by the underlying one-way function. Therefore, whereas a 128-bit symmetric key and a 3000-bit public key afford a similar level of security, the symmetric-key approach is generally more efficient to implement.

Modern cryptography is a foundation of today’s information and economic security, enabling secure communication between individuals, businesses, and governments. Most of this information is encrypted using some form of public-key encryption scheme.

The workhorse of public-key encryption schemes is the RSA cryptosystem, which can be broken into four steps:

  1. Key generation (a public key {e, N} and a private key {d, N}),
  2. Public key distribution,
  3. Message encryption (a message m → a ciphertext c) and transmission,
  4. Message decryption.

I will dive further into the inner workings of the RSA encryption method, as understanding them is important to better appreciate the impact QC will have on such a vital aspect of cybersecurity.

Let’s take a scenario where Alice and Bob want to communicate securely with each other, and then there is Eve, who wants to be cheeky and drop in uninvited on this secure conversation.

The first step is to create the public and private encryption keys that Alice and Bob will need to encrypt and decrypt their secure comms. The steps below summarise a simple example of how the keys are generated when the two random prime numbers, p and q, are 17 and 19 respectively.
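As a sketch of the arithmetic, here is how the key generation plays out in Python for p = 17 and q = 19. The public exponent e = 5 is my illustrative choice (any e coprime to r would do), and the modular inverse via pow requires Python 3.8 or later.

```python
# RSA key generation for the toy primes p = 17, q = 19. The exponent e = 5 is an
# illustrative assumption; r is taken here as (p - 1)(q - 1), Euler's totient of N.
p, q = 17, 19
N = p * q                  # modulus: 323
r = (p - 1) * (q - 1)      # 288

e = 5                      # public exponent; must satisfy gcd(e, r) == 1
d = pow(e, -1, r)          # private exponent: 173, since 5 * 173 = 865 = 3 * 288 + 1

print(f"public key  {{e, N}} = {{{e}, {N}}}")   # {5, 323}
print(f"private key {{d, N}} = {{{d}, {N}}}")   # {173, 323}
```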

Having created the keys, Alice sends the public key {e, N} to Bob. Importantly, she retains the private key {d, N} and does not reveal the values p, q, or r.

Bob has a message m he wants to send to Alice. Having received the public encryption key {e, N}, he encrypts m to create the ciphertext c using the public key and modular exponentiation: c = m^e mod N.

After the message is sent to Alice, she decrypts it with her private key {d, N} using the properties of modular exponentiation: m = c^d mod N, provided the original message m is smaller than the modulus N. Eve has access to the public key {e, N}, but this is of no use to her in trying to decrypt the ciphertext, since recovering m from c requires the private exponent d, and there is no known efficient way to derive d from {e, N} without the prime factors of N.
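Putting the toy keys to work, here is the complete round trip in Python (the message m = 42 is an arbitrary choice, kept smaller than N):

```python
# Encrypt/decrypt round trip with the toy keys {e, N} = {5, 323}, {d, N} = {173, 323}.
# Python's built-in pow(base, exp, mod) performs modular exponentiation efficiently.
e, d, N = 5, 173, 323
m = 42                 # message; must satisfy m < N

c = pow(m, e, N)       # Bob:   c = m^e mod N  -> 264
print(pow(c, d, N))    # Alice: c^d mod N      -> 42, the original message recovered
```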

In summary, number theory and the properties of modular exponentiation underlie the security of the RSA cryptosystem. The challenge of order finding can be related to the problem of factoring, as we will see shortly. While it is straightforward to multiply two numbers p and q to give the modulus N, the inverse problem is hard on a classical computer. And if N is chosen sufficiently large, then a brute-force search to find p and q is also inefficient. Thus, the RSA cryptosystem is believed to be secure, provided the prime numbers p and q, and the private exponent d, are not made public. The encryption exponent e and the modulus N, however, can be made public. This type of cryptosystem, where the encryption key is public and anyone can encrypt a message, while the decryption key is private and only the receiver can decrypt the ciphertext, is the backbone of our modern-day communication systems.

As we just saw, if one knows the prime factors p and q, then one can efficiently find the modular exponentiation period r, and thus generate the encryption keys e and d. However, given only the modulus N, there is no known efficient classical algorithm for finding its prime factors p and q, and so no efficient means to determine the period r. Thus, even if we reveal the modulus N and the public encryption key e, there is no known efficient way for Eve to calculate the private key d, at least not on a classical computer.
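To see why the size of N carries all the security, consider Eve’s brute-force option, sketched below: trial division cracks our toy key instantly, but the same loop scales exponentially in the bit-length of N and is hopeless against a 2048-bit modulus.

```python
# Eve's brute-force attack on the toy key: feasible for N = 323, utterly infeasible
# for real key sizes, since trial division scales exponentially with the bits of N.
from math import isqrt

def factor(N):
    for p in range(2, isqrt(N) + 1):
        if N % p == 0:
            return p, N // p
    raise ValueError("N is prime")

N, e = 323, 5                      # Alice's public key from the toy example
p, q = factor(N)                   # (17, 19) -- instant here
d = pow(e, -1, (p - 1) * (q - 1))  # recompute the private exponent
print(p, q, d)                     # 17 19 173 -- Eve now holds the private key
```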

Breaking Cryptography -

Now that we have a basic understanding of the RSA cryptosystem, let’s move on to how QC, and specifically an algorithm running on a sufficiently large QC, can effectively break this encryption method.

The QC algorithm in question is Shor’s factoring algorithm.

Consider the previous scenario, where Eve wasn’t able to break into the private encrypted conversation between Alice and Bob using a classical computer. If Eve has access to a quantum computer that can run Shor’s algorithm, however, she can efficiently find the modular exponentiation period r. This is called order finding: Shor’s algorithm finds the order r of a number a with respect to the modulus N, where a is the number that is raised to integer powers under modular exponentiation. With a means to efficiently determine r, Eve can calculate the prime numbers p and q and then derive the decryption key d, thus compromising RSA.
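To make the order-finding connection concrete, here is a classical sketch of the reduction for our toy modulus N = 323 (the base a = 2 is my choice). Classically, the find_order step below is the exponential bottleneck; it is exactly this step that Shor’s quantum subroutine performs efficiently.

```python
# Order finding -> factoring, illustrated classically for the toy modulus N = 323.
from math import gcd

def find_order(a, N):
    """Smallest r > 0 with a^r = 1 (mod N), found by brute force (exponential)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 323, 2
r = find_order(a, N)                      # r = 72 for this example
if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
    y = pow(a, r // 2, N)                 # 2^36 mod 323 = 305
    print(gcd(y - 1, N), gcd(y + 1, N))   # 19 17 -- the prime factors of N
```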

Shor’s algorithm is run using a combination of classical and quantum computers. The pre- and post-processing of the data is done on a classical computer, while the main processing is done on a QC. The quantum part of Shor’s algorithm is called “period finding”.

https://upload.wikimedia.org/wikipedia/commons/thumb/6/6b/Shor%27s_algorithm.svg/1920px-Shor%27s_algorithm.svg.png

Figure: “Period Finding” – A Quantum subroutine in Shor’s algorithm.

The figure above shows what a QC programme for Shor’s algorithm’s quantum subroutine looks like. For this article, it’s not important to understand what each of the blocks means, only that they are quantum gates performing specific operations. At the heart of this subroutine is the QFT block.

The Quantum Fourier Transform (QFT) is the quantum version of the classical discrete Fourier transform. The classical Fourier transform takes a vector of numbers x (often a digital sampling of a signal to be analysed) and transforms it to a vector of numbers y in the frequency domain. Periodic behaviour in the time domain is more easily identified in the transformed frequency domain. For example, a sine wave that oscillates in time with a specific frequency transforms to a large-amplitude spike at plus or minus that frequency in the transform domain; it is then easy to identify the location of the spikes and read off the frequency directly. And if the signal’s frequency is known, so is its period, since the frequency is 1 over the period. The Quantum Fourier Transform is essentially the same transformation, which is why it is used here for period finding. It takes a basis state |j⟩ and transforms it to a superposition state with specific phase factors: |j⟩ → (1/√N) Σₖ e^(2πi jk/N) |k⟩. If we start with a superposition state with coefficients xⱼ, the transformed result is also a superposition state with coefficients yₖ, where the probability amplitudes yₖ are the classical discrete Fourier transform of the probability amplitudes xⱼ. And thanks to quantum parallelism and quantum interference, implementing the Quantum Fourier Transform requires exponentially fewer operations than the classical fast Fourier transform algorithm.

If the above paragraph is too technical, which it is, then don’t fret. Just know that the QFT sits at the heart of Shor’s algorithm, and that there are QC programming languages in which the required routines can easily be written.
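If you would like an intuition for why a Fourier transform helps with period finding at all, here is a purely classical stand-in using numpy’s FFT (the signal and its period are made up for illustration): a periodic signal produces sharp spikes in the frequency domain, from which the period can simply be read off.

```python
# Classical analogue of period finding: the FFT of a period-8 spike train has peaks
# at multiples of N/period, so the period can be read off the first non-zero peak.
import numpy as np

N, period = 64, 8
x = np.array([1.0 if j % period == 0 else 0.0 for j in range(N)])

y = np.fft.fft(x)
peaks = np.flatnonzero(np.isclose(abs(y), abs(y).max()))
print(peaks)           # [ 0  8 16 24 32 40 48 56]
print(N // peaks[1])   # recovered period: 8
```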

Figure: Shor’s Algorithm for a two-qubit system.

Making Cryptography Stronger -

The long-term security of modern cryptographic systems has been brought into question by developments in quantum computing. The existence of Shor’s factoring algorithm shows that the inversion of the one-way functions needed for asymmetric cryptography is an instance where quantum computers have a quantum advantage over classical algorithms. It is worth noting that the development of quantum computers is not thought to be catastrophic for symmetric-key cryptography schemes. Now, while acronyms like RSA and AES might not mean much to most users of technology, these tools underpin the security of secure web browsing, VPNs, secure email, auto-updates, blockchains, and so on, which in turn are the basis of our technological infrastructure, including the internet, cloud computing, and IoT. So, to keep reaping the benefits of this infrastructure once quantum computers arrive, it is imperative that we first deploy new cryptographic tools designed to be safe against quantum attacks. This is sometimes called quantum-safe cryptography or post-quantum cryptography.

Now, it’s important to note that quantum computers capable of breaking the RSA and Diffie-Hellman schemes in use today do not currently exist, but this doesn’t mean we can just wait and see before doing something about it. Based on the pace of quantum computing research, and the effort required to update cryptographic standards, researchers have argued that the process of finding and implementing algorithms which are suitably robust against quantum computers should begin immediately. Professor Michele Mosca, a leading expert in quantum computing research, has estimated the odds of a quantum computer capable of breaking RSA-2048 – an RSA cryptosystem using 2048-bit keys – being constructed by 2026 and by 2031 at 1/7 and 1/2 respectively. These probabilities are informed estimates based on several parameters, such as when a fault-tolerant and scalable qubit will be designed, how many qubits are needed to break RSA-2048, and how long it will take to scale quantum computers to this size.

It can be argued that based on these projections, quantum computers capable of breaking RSA-2048 are reasonably likely to be available before the shelf-life of information encrypted with RSA-2048 has expired. Both NIST and NSA have advocated that action should begin now to safeguard cryptographic standards from quantum computers. In 2016, NIST began requesting nominations for post-quantum public-key cryptography algorithms, with the eventual goal of standardising one or more quantum-resistant algorithms. Additionally, in 2015 the Information Assurance Directorate of the NSA announced it would begin transitioning to quantum-resistant algorithms.

The search for cryptographic algorithms and hardware which are secure against both classical and quantum computers is called post-quantum, or quantum-resistant, cryptography. At present, post-quantum approaches come either from quantum cryptography or from mathematical problems, similar to prime factorisation, which have no known efficient solution on either classical or quantum computers.

Quantum key distribution (QKD) uses fundamental quantum-mechanical properties of systems, usually implemented with light, to securely exchange keys over an unsecured public channel. Unlike other solutions, the security of QKD does not rely on assumptions about the computational resources of an adversary and can be considered provably secure. Realistically, many QKD implementations are open to attacks on the hardware or software which perform the protocol. At present, QKD is a mature and active area of research with commercial systems available.

Alternatively, two of the most promising post-quantum cryptographic schemes are lattice-based cryptography and code-based cryptography. Lattice-based cryptography makes use of problems on sets of points in n-dimensional space which have a periodic structure; these sets are called lattices. An example of a lattice problem used in cryptography is the “shortest vector problem”, which asks for the shortest non-zero vector in a given lattice. Code-based cryptography uses the properties of error-correcting codes to secure information. Intuitively, these systems consist of a public key – which adds random noise to a signal – and a private key – an error-correcting code capable of removing the noise. While it is still unclear which, if any, of these methods will become a post-quantum cryptographic standard, it is unlikely that the new approach will be compatible with existing systems. Rather, new hardware will likely be required, at significant expense, and careful planning will therefore be needed before the current standards are updated.

In 2016, Google experimented with post-quantum cryptography in its Chrome web browser, deploying a lattice-based key-exchange algorithm named “New Hope”.
