
The development of artificial intelligence today is marked by increased computational power, new algorithms, and big data. One impressive milestone in this area is Google's AlphaGo. However, this progress faces growing challenges, and a major bottleneck of AI today is the lack of adequate computing power for processing big data. Quantum computing offers a new and viable way to address these challenges. A recent work designed a quantum classifier that runs on IBM's five-qubit quantum computer and tested its performance on the Iris dataset as well as a circles dataset. As quantum machine learning is still an emerging discipline, it may be enlightening to conduct an empirical analysis of this quantum classifier on artificial datasets to learn its unique features and potential. Our work on the quantum classifier can be summarized in three parts. The first is to run its original version as a binary classifier on artificial datasets, using visualization to reveal the quantum nature of the algorithm. The second is to analyze the swap operation used in the original circuit to satisfy a hardware constraint and to investigate its impact on the classifier's performance. The last part is to extend the original binary-classification circuit to a circuit for multiclass classification and test its performance. Our findings shed new light on how this quantum classifier works.

Quantum computation is a new computing paradigm based on the laws of quantum mechanics and is currently one of the most active topics at the frontiers of computing. By carefully exploiting the unique features of quantum states, quantum computers can efficiently solve some problems that are believed to be hard for classical computers.

Two well-known quantum algorithms demonstrate the advantage of quantum computation over the classical approach. Shor's algorithm for finding the prime factors of a number runs in polynomial time, while the best known classical algorithms run in superpolynomial time. Grover devised a quantum algorithm for unstructured search that achieves a quadratic speedup over classical algorithms. Superposition, entanglement, and interference of quantum states, none of which are available in classical computing, are generally considered the resources behind this speedup. Furthermore, quantum computers can simulate other quantum systems, such as those arising in protein folding, beyond the capability of any classical computer.

IBM recently released the Quantum Experience to allow users to access a five-qubit quantum processor, with services including circuit design, simulation, testing, and actual computation on a physical device [

It is true that a quantum computer, using quantum laws, can process many quantum states simultaneously through quantum parallelism; for example, it can compute f(x) by evaluating the function for multiple values of x at once. But when we measure the results of the computation, we obtain only one result, and this result can even be nondeterministic. Therefore, to get a classical result, we may have to run a quantum algorithm multiple times. In this sense, an algorithm that produces the same classical result with fewer runs is the better one. In particular, for a quantum classifier, we are interested not only in the classification accuracy, but also in the number of runs (or shots, in IBM's term) that the algorithm requires.
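The shot-count point above can be sketched classically: a measurement probability is estimated from repeated runs, and the estimate's standard error shrinks like 1/sqrt(shots). The probability value and seed below are invented for illustration.

```python
import numpy as np

# Hypothetical probability of measuring |0> on the class qubit;
# each "shot" yields a single binary outcome, as on real hardware.
rng = np.random.default_rng(0)
p_true = 0.7

for shots in (10, 100, 10_000):
    outcomes = rng.random(shots) < p_true  # one binary sample per run
    p_hat = outcomes.mean()                # empirical frequency of |0>
    # the spread of p_hat around p_true shrinks like 1/sqrt(shots)
    print(shots, round(p_hat, 3))
```

This is why, for a fixed target accuracy in P(0), fewer required shots directly translate into a cheaper algorithm.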

Machine learning is generally categorized into three classes: supervised learning, unsupervised learning, and reinforcement learning. In supervised learning, there are classification and regression problems. Quantum advantage has long been known for problems such as factoring, search, and simulation. Only recently have quantum algorithms begun to emerge that demonstrate quantum speedups in machine learning. These quantum machine learning algorithms have to address basic techniques such as how to compute the distance between data points and the inner product of two vectors. These tasks are easy for classical computers because they can inspect intermediate values at any time during the computation, but quantum computers usually can take direct measurements only at the end of the process. It is no easy task for a quantum algorithm to exploit measurement in a non-trivial way during the computation.

The work in [

This classifier was tested on the Iris dataset and a circles dataset. The numerical results were summarized in two tables [

This paper reports our empirical analysis of a quantum classifier implemented on IBM’s 5Q computer. We will briefly present the distance based quantum classifier created in [

Let $N = 2^n$ and let $X = (x_0, x_1, \cdots, x_{N-1}) \in \mathbb{R}^N$ be a unit vector. A dataset of size $M$ can be denoted by $D = \{(X_0, y_0), (X_1, y_1), (X_2, y_2), \cdots, (X_{M-1}, y_{M-1})\}$, where $X_m \in \mathbb{R}^N$ represents a data point and $y_m$ represents its class label. For binary classification, $y_m \in \{0, 1\}$. The amplitude encoding of $X$ is $|\psi_X\rangle = \sum_{i=0}^{N-1} x_i |i\rangle$. The first step of the quantum classifier is, through some quantum operations (see the quantum circuit in
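Amplitude encoding, as defined above, can be sketched in a few lines of NumPy; the convention assumed here is simply that the normalized entries of the vector become the amplitudes of the computational basis states.

```python
import numpy as np

def amplitude_encode(x):
    """Map a real vector to the amplitude vector |psi_X> = sum_i x_i |i>."""
    x = np.asarray(x, dtype=float)
    return x / np.linalg.norm(x)  # normalize so the state has length one

psi = amplitude_encode([3.0, 4.0])  # N = 2, so a single qubit suffices
print(psi)                # [0.6 0.8]
print(np.sum(psi ** 2))   # 1.0: squared amplitudes sum to one
```

Note that with this encoding, $N = 2^n$ components fit into only $n$ qubits, which is where the representational economy of the classifier comes from.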

$$|D\rangle = \frac{1}{Z_1} \sum_{m=0}^{M-1} |m\rangle \left( |0\rangle |\psi_{\tilde{X}}\rangle + |1\rangle |\psi_{X_m}\rangle \right) |y_m\rangle \quad (1)$$

where $\tilde{X}$ is a test point, the $X_m$ are training points, $m$ is an index for the data point $X_m$, and $Z_1$ is a normalization constant that makes the whole quantum state of length one. After applying the Hadamard gate to the data qubit that contains $\tilde{X}$ and $X_m$, the state becomes

$$|D_1\rangle = \frac{1}{Z_1} \sum_{m=0}^{M-1} |m\rangle \left( |0\rangle |\psi_{\tilde{X}+X_m}\rangle + |1\rangle |\psi_{\tilde{X}-X_m}\rangle \right) |y_m\rangle \quad (2)$$

where $|\psi_{\tilde{X} \pm X_m}\rangle = |\psi_{\tilde{X}}\rangle \pm |\psi_{X_m}\rangle$. The key observation is

$$\frac{1}{Z_2} \sum_{y_m = 0} |\tilde{X} + X_m|^2 = 1 - \frac{1}{Z_2} \sum_{y_m = 0} |\tilde{X} - X_m|^2$$

where $Z_2$ is another normalization constant. This implies that the closer the test point $\tilde{X}$ is to the training point $X_m$, the larger the amplitude in front of $y_m$, and therefore the more likely it is to be observed at the final measurement. In other words, the probability of measuring the class qubit $y_m$ in state $|0\rangle$ is:

$$P(\tilde{y} = 0) = \frac{1}{Z_2} \sum_{y_m = 0} |\tilde{X} + X_m|^2 = 1 - \frac{1}{Z_2} \sum_{y_m = 0} |\tilde{X} - X_m|^2 \quad (3)$$
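The interference step from Equation (1) to Equation (2) can be checked numerically for a single training point. This is an illustrative sketch, not the paper's circuit: the test and training vectors below are arbitrary unit vectors chosen for the demonstration.

```python
import numpy as np

# A Hadamard on the ancilla turns |0>|psi_t> + |1>|psi_m> into
# |0>|psi_{t+m}> + |1>|psi_{t-m}>, up to a factor 1/sqrt(2).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)

psi_t = np.array([1.0, 0.0])                  # amplitude-encoded test point
psi_m = np.array([np.cos(0.4), np.sin(0.4)])  # amplitude-encoded training point

e0, e1 = np.eye(2)  # ancilla basis states |0> and |1>
state = np.kron(e0, psi_t) + np.kron(e1, psi_m)  # unnormalized, as in Eq. (1)
after = np.kron(H, I2) @ state                   # Hadamard on the ancilla only

expected = (np.kron(e0, psi_t + psi_m) + np.kron(e1, psi_t - psi_m)) / np.sqrt(2)
print(np.allclose(after, expected))  # True
```

The sum and difference vectors $\psi_{\tilde{X} \pm X_m}$ thus appear in the two ancilla branches exactly as Equation (2) states.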

By exploiting superposition, entanglement, and interference, this algorithm evaluates the distance of the test point to all the training points at once. More importantly, through clever mathematical manipulation, the probability of the test point belonging to class 0 or class 1 can be read out at the end of the quantum circuit. However, when measuring the ancilla qubit (the qubit carrying $|0\rangle$ and $|1\rangle$ in Equations (1) and (2)), we may see $|1\rangle$ instead of $|0\rangle$. Therefore, the final tally of the counts for P(0) or P(1) has to exclude the runs in which the ancilla qubit is measured as $|1\rangle$, which is quite different from classical machine learning algorithms.
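A purely classical sketch of Equation (3) after postselection on the ancilla looks as follows; the data points and labels are invented for illustration, and the probabilities are computed directly rather than estimated from shots.

```python
import numpy as np

def classify(x_test, X_train, y_train):
    """Classical simulation of Eq. (3): P(class) from |x~ + x_m|^2 overlaps."""
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train)
    overlaps = np.sum((x_test + X_train) ** 2, axis=1)  # |x~ + x_m|^2 per point
    p0 = overlaps[y_train == 0].sum()
    p1 = overlaps[y_train == 1].sum()
    Z = p0 + p1  # normalization, playing the role of Z_2 after postselection
    return p0 / Z, p1 / Z

# Two unit-vector training points; the test point is close to the class-0 point.
X = [[1.0, 0.0], [0.0, 1.0]]
y = [0, 1]
p0, p1 = classify(np.array([np.cos(0.1), np.sin(0.1)]), X, y)
print(p0 > p1)  # True: the test point is nearer the class-0 training point
```

On real hardware these probabilities are instead estimated by repeating the circuit and discarding every run in which the ancilla is measured as $|1\rangle$.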

In this circuit in

A typical quantum circuit is composed of initial states, followed by several unitary operators, and terminated by a few measurements according to the task of the algorithm. During execution of the algorithm, the superposed quantum states have to remain coherent. As a result, the length of a quantum circuit is directly limited by the amount of time that these quantum states can remain coherent [

The quantum classifier in [

Our first task is to test the binary classifier in [

In the next two experiments, we choose θ = π/4 and θ = π/16. As expected, when the distance between the two training points becomes smaller, the binary classification task gets harder, which can be seen from the values of P(0) and P(1) in

As the test point moves from 0 (class label 0) to π/4 (class label 1), P(0) changes gradually from about 0.54 to about 0.46, and P(1) changes from about 0.46 to about 0.54. When the test point gets into the middle of the interval [0, π/4], the classification problem becomes harder for a distance-based classifier. Note that P(0) and P(1) do not take values near 1 and 0, since the distance between 0 and π/4 is smaller than that between 0 and π, compared with

In this plot, the distance between 0 and π/16 is even smaller than that between 0 and π/4, in comparison to

This behavior reflects the uncertainty inherent in quantum computing, caused by the genuinely probabilistic properties of quantum states.
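The angle experiments above can be sketched with the classical form of Equation (3), assuming the points are amplitude-encoded as unit vectors (cos t, sin t); the exact encoding used in the study's circuits is not reproduced here.

```python
import numpy as np

def p0(theta_test, theta0=0.0, theta1=np.pi / 4):
    """P(0) for a test angle, with one training point per class at theta0, theta1."""
    enc = lambda t: np.array([np.cos(t), np.sin(t)])  # unit-vector encoding
    d0 = np.sum((enc(theta_test) + enc(theta0)) ** 2)  # |x~ + x_0|^2
    d1 = np.sum((enc(theta_test) + enc(theta1)) ** 2)  # |x~ + x_1|^2
    return d0 / (d0 + d1)

# P(0) decreases smoothly as the test angle moves from the class-0
# training point toward the class-1 point, crossing 0.5 at the midpoint.
for t in np.linspace(0, np.pi / 4, 5):
    print(round(t, 3), round(p0(t), 3))
```

With training points only π/4 apart, P(0) stays close to 0.5 across the whole interval, matching the observation that narrowly separated training points make the classifier's output probabilities hard to distinguish.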

The IBM 5Q consists of five superconducting transmon qubits patterned on a silicon substrate. The qubits are labeled Q0, Q1, Q2, Q3, and Q4. For the IBM 5Q system, only 4 different types of CNOT gates are available: every CNOT must have Q2 as the target qubit and Q0, Q1, Q3, or Q4 as the control qubit, see
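One standard way to cope with such a connectivity constraint, sketched below with plain matrix algebra rather than the IBM toolchain, is to reverse the direction of an available CNOT by conjugating it with Hadamards on both qubits.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
HH = np.kron(H, H)  # Hadamard on both qubits

# CNOT with qubit 0 as control, qubit 1 as target (basis order |q0 q1>)
CNOT_01 = np.array([[1, 0, 0, 0],
                    [0, 1, 0, 0],
                    [0, 0, 0, 1],
                    [0, 0, 1, 0]])
# CNOT with qubit 1 as control, qubit 0 as target
CNOT_10 = np.array([[1, 0, 0, 0],
                    [0, 0, 0, 1],
                    [0, 0, 1, 0],
                    [0, 1, 0, 0]])

# Conjugation by Hadamards swaps which qubit is control and which is target.
print(np.allclose(HH @ CNOT_01 @ HH, CNOT_10))  # True
```

This identity lets a circuit use a CNOT in either direction even though the hardware only implements one, at the cost of extra single-qubit gates.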

To test the effects of the swap operation as shown in
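The swap operation itself is not free on hardware: a SWAP of two qubits decomposes into three alternating CNOTs, which the small check below verifies numerically (again with plain matrices, not the device toolchain).

```python
import numpy as np

CNOT_01 = np.array([[1, 0, 0, 0],   # control q0, target q1 (basis |q0 q1>)
                    [0, 1, 0, 0],
                    [0, 0, 0, 1],
                    [0, 0, 1, 0]])
CNOT_10 = np.array([[1, 0, 0, 0],   # control q1, target q0
                    [0, 0, 0, 1],
                    [0, 0, 1, 0],
                    [0, 1, 0, 0]])
SWAP = np.array([[1, 0, 0, 0],      # exchanges |01> and |10>
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]])

# SWAP = CNOT(0->1) . CNOT(1->0) . CNOT(0->1)
print(np.allclose(CNOT_01 @ CNOT_10 @ CNOT_01, SWAP))  # True
```

Each swap therefore adds three two-qubit gates to the circuit, which is why it is natural to ask whether inserting swaps to satisfy connectivity degrades the classifier's accuracy.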

To make a multiclass classifier, four classes to be exact in this study, we use two qubits for the index and two qubits for the classes, following the circuit patterns shown in
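In the classical form of Equation (3), the four-class extension amounts to widening the class register to two qubits, so that P(class c) is proportional to the summed overlaps of the test point with the training points of class c. The data points and labels below are invented for illustration.

```python
import numpy as np

def classify_multiclass(x_test, X_train, y_train, n_classes=4):
    """Classical sketch of the 4-class classifier: one probability per class."""
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train)
    overlaps = np.sum((x_test + X_train) ** 2, axis=1)  # |x~ + x_m|^2 per point
    probs = np.array([overlaps[y_train == c].sum() for c in range(n_classes)])
    return probs / probs.sum()  # normalize after postselection

# One unit-vector prototype per class, at four angles on the circle.
angles = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]
X = [[np.cos(a), np.sin(a)] for a in angles]
y = [0, 1, 2, 3]
probs = classify_multiclass(np.array([np.cos(0.2), np.sin(0.2)]), X, y)
print(np.argmax(probs))  # 0: the test point is closest to the class-0 prototype
```

On the quantum side, the measurement statistics of the two class qubits, conditioned on the ancilla, play the role of these four normalized probabilities.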

Quantum machine learning, though in its initial stage, has demonstrated its potential to speed up some of the costly machine learning calculations compared to existing classical approaches. This study presents an empirical analysis of the distance-based quantum classifier created in [

differentiate the training points. The values of P(0) and P(1) are very close to 0.5, oscillating about it. Because of the stochastic nature of quantum algorithms, this phenomenon makes prediction more difficult than in the corresponding classical algorithms.

The second is to see whether the swap operation, commonly used in quantum circuit design to meet hardware constraints, has an impact on the performance of this classifier. A typical quantum machine learning circuit employs superposition, entanglement, and interference of qubits, so any swap of two qubits in the circuit might influence the final outcome of the algorithm. Our experiments suggest that the swap operation does not make a difference in the cases we tested.

Our final goal is to extend the binary classifier in [

It is known that some quantum machine learning algorithms can outperform their classical counterparts, but the full scope of their potential needs to be further investigated. Our work adds new knowledge to this field by helping to clarify how this quantum classifier works, contributing to the broader goal of understanding the value of quantum computing for machine learning and AI.

Hu, W. (2018) Empirical Analysis of a Quantum Classifier Implemented on IBM’s 5Q Quantum Computer. Journal of Quantum Information Science, 8, 1-11. https://doi.org/10.4236/jqis.2018.81001