Page 71 - Kaleidoscope Academic Conference Proceedings 2024
Innovation and Digital Transformation for a Sustainable World
and the computational resources available to the analyst, although newly developed improvements further refine the existing classification techniques and their effectiveness. The classification techniques of ML are depicted in Figure 3.

2.1 Logistic Regression

LR [14] is a statistical method for binary classification. It estimates the probability that an input belongs to a categorical class using the logistic function, which maps the input features onto the interval (0, 1). The model learns the coefficients that yield the best fit, thereby establishing the relationship between input and output for the given data set. Mathematically, the predicted probability σ(x) is given by

σ(x) = 1 / (1 + e^(−β·x))   (1)

where β is the vector of coefficients and x is the input vector. These coefficients are optimized using methods such as Maximum Likelihood Estimation, and the model is trained to accurately predict the output from the input variables.

2.2 Support Vector Machine

The SVM [15] is a supervised machine learning algorithm designed to classify cases into two distinct classes. In binary classification, we have a dataset of feature vectors x_i with corresponding class labels y_i, and SVM aims to find the hyperplane, represented by w and b, that maximizes the margin between the classes. Denoting w as the weight vector, b as the bias, and x as the input feature vector, the problem is formulated as

min_{w,b} (1/2) ∥w∥²   (2)

subject to y_i (w · x_i + b) ≥ 1 for all i, and the decision function can be written as

f(x) = w · x + b   (3)

where the class label predicted for input x is the positive class if f(x) ≥ 0.

2.3 Principal Component Analysis

PCA [16] is a method for dimensionality reduction and for visualizing data, transforming the original variables into orthogonal vectors called principal components that maximize data variance. Given an n × d data matrix X, PCA computes the eigenvectors and eigenvalues of its covariance matrix. The first principal component, PC_1, is the linear combination of variables maximizing variance, with subsequent components PC_2, PC_3, . . . orthogonal to the preceding ones, capturing the remaining variance. PCA's essence lies in expressing the data in terms of these components, effectively reducing dimensionality while preserving the most significant information. Mathematically, PCA computes X_pca = XV, where X_pca contains the principal component scores, V comprises the eigenvectors, and X represents the original data.

3. QUANTUM COMPUTING

Quantum computing [17] represents a rapidly evolving domain harnessing the principles of quantum mechanics to execute computational tasks. In contrast to classical computers, which rely on binary bits (0s and 1s), quantum computers utilize quantum bits, or qubits. Qubits possess the unique ability to occupy multiple states simultaneously through the phenomena of superposition and entanglement, thereby empowering quantum computers to handle extensive data volumes and execute specific calculations with remarkable efficiency compared to their classical counterparts. To define the concept of quantum computing mathematically, let H denote the Hilbert space associated with the quantum computing system. A quantum computer operates by manipulating qubits, which are represented as vectors in H. Each qubit can be in a superposition of the basis states, denoted by |0⟩ and |1⟩, where |0⟩ represents the state corresponding to the logical value 0 and |1⟩ represents the state corresponding to the logical value 1.

3.1 Superposition

Quantum states can be represented as linear combinations of basis states, allowing qubits to exist in a superposition of states. Mathematically, for a single qubit |ψ⟩, superposition is expressed as

|ψ⟩ = α|0⟩ + β|1⟩   (4)

where α and β are complex probability amplitudes satisfying |α|² + |β|² = 1, enabling the representation of both 0 and 1 simultaneously.

3.2 Entanglement

Entanglement means that when two qubits are correlated, one qubit's state depends on the other's state regardless of their physical separation. Mathematically, an entangled state of two qubits can be represented as

|ψ⟩ = (1/√2)(|00⟩ + |11⟩)   (5)

3.3 Quantum Gates

Quantum gates are unitary operators that manipulate qubits to perform specific operations. Analogous to classical logic gates, quantum gates serve as the building blocks of quantum algorithms. Mathematically, a quantum gate U operates on a qubit |ψ⟩ as |ψ′⟩ = U|ψ⟩.

4. QUANTUM MACHINE LEARNING

QML is a widening field of study in which quantum computing unites with machine learning. In a nutshell, QML puts the properties and concepts of quantum mechanics to use in designing and inventing new machine learning algorithms and associated techniques. The classification techniques of QML are depicted in Figure 4.
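As an illustrative aside, the logistic model of Eq. (1) can be sketched in a few lines of Python. The toy data and the plain gradient-ascent fit below are our own assumptions for illustration, not part of the original study, which does not specify an implementation.

```python
import numpy as np

def sigmoid(z):
    # Logistic function mapping any real value into (0, 1), as in Eq. (1)
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    # Maximum-likelihood fit via gradient ascent on the log-likelihood;
    # the gradient of the log-likelihood w.r.t. beta is X^T (y - p)
    beta = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = sigmoid(X @ beta)                 # predicted probabilities
        beta += lr * X.T @ (y - p) / len(y)   # ascent step
    return beta

# Toy 1-D data with an intercept column (illustrative only)
X = np.array([[1.0, x] for x in [-2, -1, -0.5, 0.5, 1, 2]])
y = np.array([0, 0, 0, 1, 1, 1])
beta = fit_logistic(X, y)
probs = sigmoid(X @ beta)   # fitted probabilities, all in (0, 1)
```

The learned coefficients push σ(x) below 0.5 for the negative examples and above 0.5 for the positive ones, matching the thresholding interpretation of the logistic output.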
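Likewise, the projection X_pca = XV used by PCA can be sketched with NumPy's eigendecomposition. The random data matrix below is a stand-in assumed purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy n x d data matrix (n = 200 samples, d = 3 features) with very
# different variances per feature, then centred column-wise
X = rng.normal(size=(200, 3)) @ np.diag([3.0, 1.0, 0.1])
Xc = X - X.mean(axis=0)

# Eigenvectors and eigenvalues of the covariance matrix
cov = np.cov(Xc, rowvar=False)
eigvals, V = np.linalg.eigh(cov)      # eigh returns ascending order
order = np.argsort(eigvals)[::-1]     # sort descending by variance
eigvals, V = eigvals[order], V[:, order]

# Principal component scores: X_pca = XV
X_pca = Xc @ V

# Fraction of total variance captured by each component
explained = eigvals / eigvals.sum()
```

The variance of each column of X_pca equals the corresponding eigenvalue, which is why the leading components preserve the most significant information.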
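The quantum notions of Section 3 — superposition (Eq. (4)), entanglement (Eq. (5)), and unitary gates — can be sketched by simulating state vectors directly with NumPy. The Hadamard-plus-CNOT construction of the Bell state is a standard textbook circuit, shown here only as a sketch, not as anything prescribed by the paper.

```python
import numpy as np

# Computational basis states |0> and |1> as vectors in H
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Superposition |psi> = alpha|0> + beta|1>, Eq. (4),
# with amplitudes chosen so that |alpha|^2 + |beta|^2 = 1
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
psi = alpha * ket0 + beta * ket1

# Quantum gates as unitary operators acting by |psi'> = U|psi>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Bell state (1/sqrt(2))(|00> + |11|>) of Eq. (5):
# apply H to the first qubit of |00>, then CNOT on the pair
two_qubit = np.kron(H @ ket0, ket0)   # (H|0>) tensor |0>
bell = CNOT @ two_qubit
```

Measuring either qubit of `bell` collapses the other to the same value, which is the correlation-at-a-distance described in Section 3.2.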