Google’s Quantum Supremacy

by ayushkhaitan3437

The paper that I’m going to talk about today is the famous Quantum Supremacy Using a Programmable Superconducting Processor. It was published in Nature in 2019, and demonstrated quantum supremacy for the first time in human history. It is one of the best-written papers I’ve ever read, and is more helpful for understanding the definitions and concepts involved than even expansive Wikipedia entries. Its findings have been disputed by companies like IBM and mathematicians like Gil Kalai.

But first, a primer on quantum measurement. What exactly is it? If you’ll allow me a bad analogy, quantum measurement is like picking out an M&M from a bag of M&Ms. You never know what color you’re going to pick out. The bag may contain an equal number of M&Ms of all colors, or it may contain a lot of blue ones and very few yellow ones (the different physical states, of which the quantum state is a superposition, might have different probabilities, or, said with a top hat on, the different eigenstates might have different corresponding probability amplitudes). So although it is more likely that you’ll pick out a blue one, there is still a real possibility that you might pick out a yellow one. Hence, it doesn’t make sense to ask what “the” color of the M&Ms in the bag is; the bag is a (weighted) superposition of all the colors present in it.
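To make the measurement analogy concrete, here is a minimal Python sketch (the colors and their weights are invented for illustration): a single “measurement” is just one draw from a weighted distribution, and only repeated draws reveal the weights.

```python
import numpy as np

# A hypothetical "bag" of M&M colors with unequal weights: the quantum analogue
# is a state whose eigenstates carry different probabilities.
colors = ["blue", "yellow", "red", "green"]
probabilities = [0.5, 0.05, 0.25, 0.2]  # must sum to 1

rng = np.random.default_rng(seed=0)

# A single "measurement" yields exactly one color, chosen at random
# according to the weights -- you never observe the superposition itself.
one_draw = rng.choice(colors, p=probabilities)
print("one measurement:", one_draw)

# Repeating the measurement many times reveals the underlying distribution.
many_draws = rng.choice(colors, size=10_000, p=probabilities)
for c in colors:
    print(c, round(np.mean(many_draws == c), 3))
```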

Quantum Supremacy

So what exactly is quantum supremacy? Let me offer another terrible analogy: some humans run slowly, and some, like Usain Bolt, run very fast. Although Usain Bolt holds the record for being the fastest human on Earth, we expect that in the future there will be even faster humans who break those records. However, regardless of how fast humans may run in the future, they will never be able to outrun jet planes. Similarly, quantum supremacy is achieved when we can show that a quantum processor performs a certain task much faster than any classical processor, regardless of how brilliant the creator of that classical processor is, or how clever the algorithm it runs.

But how do we find such a task, one where we can be sure that brilliance will not help solve the problem faster than a quantum processor? As an example, suppose a person quick at mental arithmetic is given a string of randomly generated numbers and asked to square them. If the numbers were not randomly generated, say they were all of the form 10^n, this would be easy for anyone who recognizes that the square of 10^n is 10^{2n}. However, because they are randomly generated, there is no pattern to the numbers. Hence, regardless of that person’s mental faculties, squaring a million numbers like \{247, 3691, 47385, \dots\} will still take them days (if not months or years). However, plugging all these numbers into a supercomputer will give you the answer in seconds. Just as no amount of brilliance will help that person square a million randomly generated integers in seconds, we should find a task in which the data is randomly generated, so that there are no patterns in it, and hence no amount of algorithmic genius will let a classical processor perform the task as fast as a quantum processor.

The task that the researchers chose is the following, which I’ll again explain with the M&M analogy: your job is to make a bag with M&Ms of 53 different colors (as 53 qubits are used in the experiment). There may be one green M&M, 5 red M&Ms, etc.; the numbers don’t have to be the same. However, there is one caveat: the number of blue and yellow M&Ms is much higher than the number of M&Ms of the other colors. Hence, if you were to pick an M&M, the probability of picking each color is well defined, and you’re much more likely to pick a blue or yellow M&M than any other color.

Now imagine that your friend doesn’t know how many M&Ms of each color are in the bag. He is asked to keep picking M&Ms from the bag until he can write down a probability distribution (the probability of picking each of the 53 colors from the bag). Moreover, this probability distribution should be “close” to the actual probability distribution. Suppose we calculate that, for the distribution your friend writes down to be “sufficiently close” to the actual one, he needs to pick out at least 1,000 M&Ms. Because he doesn’t have any prior information about the contents of the bag, he cannot use any clever techniques to work out this probability distribution. He just has to keep picking out M&Ms until he has a good idea of the contents of the bag. How much time does this take him? Your friend may take, say, 10 minutes. But your quantum friend can do it in 1 millisecond.
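Here is a rough Python sketch of what the friend is doing; the bag contents, the batch size, and the “close enough” threshold are all invented for illustration. The point is that the only way to learn the distribution is to keep drawing and tallying.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hidden contents of the bag: 53 "colors" with a skewed distribution
# (a couple of colors are far more likely than the rest), made up for illustration.
n_colors = 53
weights = rng.exponential(size=n_colors)
weights[:2] *= 20            # make two colors dominate
true_dist = weights / weights.sum()

def total_variation(p, q):
    """Half the L1 distance between two distributions: 0 = identical, 1 = disjoint."""
    return 0.5 * np.abs(p - q).sum()

# Draw M&Ms one batch at a time until the empirical distribution is "close enough".
counts = np.zeros(n_colors)
draws = 0
while True:
    batch = rng.choice(n_colors, size=1_000, p=true_dist)
    counts += np.bincount(batch, minlength=n_colors)
    draws += 1_000
    empirical = counts / counts.sum()
    if total_variation(empirical, true_dist) < 0.05:
        break

print(f"needed about {draws} draws to get within 0.05 total variation")
```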

The task given to the quantum processor is this: a quantum state, which in this case is a superposition of 2^{53} discrete states that look like |0011001\dots\rangle, is measured. Much like an M&M that you pick out of the bag cannot have more than one color, the measured quantum state projects to one of the 2^{53} states that it was a superposition of. Now, the quantum state that we initially constructed has a certain probability distribution, which tells us the probability that it will project to any given state. For instance, the probability that this quantum state will project to |011100\dots0\rangle might be 0.5. We want to measure this quantum state a few million times, so that we can approximate its probability distribution to within a given degree of accuracy.
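A toy Python version of this, with 4 qubits instead of 53 (a 2^{53}-entry state vector would not fit in memory) and a made-up state vector: each shot projects onto one basis state, and repeated shots approximate the distribution.

```python
import numpy as np

rng = np.random.default_rng(seed=2)

# Toy version with 4 qubits (2^4 = 16 basis states) instead of 53.
n_qubits = 4
dim = 2 ** n_qubits

# A made-up normalized state vector; its squared amplitudes are the
# probabilities of projecting onto each basis state |0000>, |0001>, ...
amplitudes = rng.normal(size=dim) + 1j * rng.normal(size=dim)
amplitudes /= np.linalg.norm(amplitudes)
probs = np.abs(amplitudes) ** 2

# "Measuring" a few thousand times: each shot projects onto one basis state.
shots = rng.choice(dim, size=5_000, p=probs)
estimated = np.bincount(shots, minlength=dim) / len(shots)

for k in range(dim):
    print(f"|{k:0{n_qubits}b}>  true={probs[k]:.3f}  estimated={estimated[k]:.3f}")
```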

Building a processor

How do you produce randomly generated data, though? The authors create random quantum circuits out of one- and two-qubit gates. Qubit gates, much like logic gates, take in a quantum state, tweak it, and spit out another quantum state. They can be thought of as reversible functions that map quantum states to quantum states.
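For a single qubit, “reversible function on quantum states” just means a 2×2 unitary matrix acting on a length-2 complex vector. A minimal sketch, using the standard Hadamard gate as the example:

```python
import numpy as np

# A single-qubit state is a length-2 complex vector; a gate is a 2x2 unitary matrix.
ket0 = np.array([1, 0], dtype=complex)   # |0>

# The Hadamard gate, a standard one-qubit gate.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0                 # apply the gate
print(np.abs(state) ** 2)        # [0.5, 0.5]: equal chance of measuring 0 or 1

# Reversibility: applying the inverse (conjugate transpose) undoes the gate.
print(np.allclose(H.conj().T @ state, ket0))   # True
```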

A quantum circuit takes in a quantum state (a superposition of discrete states) and gives us another quantum state. In this case, the quantum circuit takes in a quantum state (a linear superposition of the 2^{53} discrete states) and modifies the amplitudes (for instance, it might make the amplitude of |000\dots 0\rangle equal to 0.999, and the amplitude of |111\dots 1\rangle equal to 0.001). Hence, after passing through this quantum circuit, the input quantum state is much more likely to be observed as |000\dots 0\rangle than as |111\dots 1\rangle. The circuit also highly entangles certain qubits with the help of two-qubit gates, which means that certain discrete states become much more likely to appear than others. For example, if the first two qubits are entangled in a certain way, then we can be sure that |01\dots\rangle and |10\dots\rangle can be observed, but |00\dots\rangle can never be observed. In fact, the authors emphasize that the resulting probability distribution is exponential: some discrete states are exponentially more likely to be observed than others.
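The |01\dots\rangle-versus-|00\dots\rangle example can be reproduced in a few lines of numpy. This is only an illustrative two-qubit sketch (the gates here, X, H, and CNOT, are standard textbook gates, not the ones used on Sycamore): the circuit prepares the entangled state (|01\rangle + |10\rangle)/\sqrt{2}, in which |00\rangle and |11\rangle are never observed.

```python
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
# CNOT with the first qubit as control, second as target
# (basis order |00>, |01>, |10>, |11>).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.array([1, 0, 0, 0], dtype=complex)     # start in |00>
state = np.kron(I, X) @ state                     # flip second qubit: |01>
state = np.kron(H, I) @ state                     # superpose first qubit
state = CNOT @ state                              # entangle: (|01> + |10>)/sqrt(2)

for label, p in zip(["00", "01", "10", "11"], np.abs(state) ** 2):
    print(f"|{label}>  probability {p:.2f}")
# |00> and |11> come out with probability 0 -- they can never be observed.
```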

But what is a random quantum circuit? It is a circuit in which the one-qubit gates are randomly selected. In essence, we want to make sure that we are randomly changing the data, via randomly manufactured circuits. Because the data is generated randomly, a classical processor can’t use clever algorithms to calculate the sample distribution. It has to do what everyone else is doing: take a sample, count the occurrences of each state, and then write down a probability distribution. Now it all comes down to speed, not cleverness.
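As a toy illustration of a random circuit, here is a Python sketch on 4 qubits: each layer applies randomly chosen single-qubit gates followed by entangling gates between neighbours. The gate set and layout are simplifications I chose for brevity; the actual experiment interleaves \sqrt{X}, \sqrt{Y}, and \sqrt{W} single-qubit gates with its own two-qubit gates in a prescribed pattern.

```python
import numpy as np

rng = np.random.default_rng(seed=3)

n_qubits = 4
dim = 2 ** n_qubits

# A small single-qubit gate set to pick from at random (illustrative only).
I = np.eye(2, dtype=complex)
sqrt_X = 0.5 * np.array([[1 + 1j, 1 - 1j],
                         [1 - 1j, 1 + 1j]])
sqrt_Y = 0.5 * np.array([[1 + 1j, -1 - 1j],
                         [1 + 1j,  1 + 1j]])
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
gate_set = [sqrt_X, sqrt_Y, H]

def on_qubit(gate, q):
    """Embed a 2x2 gate acting on qubit q into the full n-qubit space."""
    ops = [gate if k == q else I for k in range(n_qubits)]
    full = ops[0]
    for op in ops[1:]:
        full = np.kron(full, op)
    return full

def cnot(control, target):
    """Full-space CNOT built from the control's |0><0| and |1><1| branches."""
    P0, P1 = np.diag([1, 0]).astype(complex), np.diag([0, 1]).astype(complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    branch0 = [P0 if k == control else I for k in range(n_qubits)]
    branch1 = [P1 if k == control else (X if k == target else I) for k in range(n_qubits)]
    full0, full1 = branch0[0], branch1[0]
    for a, b in zip(branch0[1:], branch1[1:]):
        full0, full1 = np.kron(full0, a), np.kron(full1, b)
    return full0 + full1

state = np.zeros(dim, dtype=complex)
state[0] = 1.0                                   # start in |000...0>

for layer in range(8):                           # a few layers of the random circuit
    for q in range(n_qubits):                    # randomly chosen single-qubit gates
        state = on_qubit(gate_set[rng.integers(len(gate_set))], q) @ state
    for q in range(0, n_qubits - 1, 2):          # entangling gates between neighbours
        state = cnot(q, q + 1) @ state

probs = np.abs(state) ** 2
print("largest output probability: ", probs.max())
print("smallest output probability:", probs.min())
```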

Fidelity measures how close the probability distribution that the processor writes down is to the actual probability distribution. If the fidelity is 0.000001, the estimated distribution is way off; if it is 0.99, it is very close to the actual probability distribution.
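For the record, the fidelity the paper actually reports is the linear cross-entropy benchmarking fidelity, \mathcal{F}_{\text{XEB}} = 2^{n}\langle P(x_i)\rangle_i - 1, where n is the number of qubits, the x_i are the sampled bitstrings, and P(x_i) is the probability an ideal, error-free circuit would assign to x_i. It averages to 1 when the samples come from the ideal circuit and to 0 when they are uniformly random guesses.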

Recall that we first have to build a quantum state and then pass it through the quantum circuit, which is randomly re-generated after a few million observations. How do we build this quantum state? We first build a superconducting circuit, in which current and voltage follow quantum rules. Each of the 53 qubits is encoded in the lowest two eigenstates of a resonant circuit, which is just a fancy way of saying that each qubit is a superposition of two states, and projects to one of them when a resonant frequency is passed through it. But what about the qubit gates, the parts of the quantum circuit that modify the amplitudes of the eigenstates? We “benchmark” them too, and ensure that they do what is expected of them (within an acceptable range of error).

After quantum states have been generated and quantum circuits have been built, we want to take a sample and see if we can write down a probability distribution that has a fidelity of at least 0.002 relative to the actual probability distribution. 0.002 may seem low, but even reaching that fidelity requires a huge number of samples. The authors calculate that for 53 qubits, and a quantum circuit comprising 1,113 single-qubit gates and 430 two-qubit gates, a fidelity of 0.002 can be achieved by taking a few million measurements. The authors contend that performing this sampling task on a classical processor (by simulating the circuit) would take 10,000 years, while it takes their quantum processor 200 seconds.

Cross-checking and benchmarking

But how do we know what the actual probability distribution is? If a classical processor takes 10,000 years to calculate an approximate probability distribution, how does it calculate the actual probability distribution in finite time? This is done by taking the quantum circuit constructed above and simplifying it a little, such that the probability distribution doesn’t change much but calculating it becomes much easier. But wait. We said the quantum circuits were randomly generated. If the classical processor can do this simplification, why doesn’t it do that instead of spending 10,000 years sampling and then calculating an approximate probability distribution?

This is because the arrangement of qubit gates is not known in advance. Only after all the sampling has been done is the arrangement of the gates examined and simplified, so that the “actual” probability distribution can be calculated.

Hence, we have a task at hand that takes a classical processor 10,000 years, but takes the quantum processor Sycamore only 200 seconds. More impressively, the actual sampling part takes only 30 seconds.

Applications and Future Directions

The authors contend that some immediate applications of this quantum processor lie in materials science, machine learning, and optimization. They expect that quantum computational power, relative to classical computational power, will grow at a doubly exponential rate: adding qubits will not make quantum processors that much slower, but will make their classical simulation exponentially slower. Moreover, they expect something like Moore’s law to hold for the components of quantum circuits. Hence, quantum computing capabilities can be expected to grow rapidly over the coming years. However, quantum error correction has to be vastly improved, and many more qubits need to be added, before we can run quantum algorithms like Shor’s algorithm, which can famously be used to break widely used public-key encryption schemes such as RSA.

References

  1. Quantum Supremacy Using a Programmable Superconducting Processor, Nature, 2019
  2. Quantum Circuit