Human-centric Computing and Information Sciences volume 12, Article number: 30 (2022)
https://doi.org/10.22967/HCIS.2022.12.030
Data clustering, a core machine learning problem, partitions a set of data objects into a predefined number of groups. This paper proposes a new, improved version of the reptile search algorithm (RSA) called the quantum mutation reptile search algorithm (QMRSA). The proposed method uses a quantum mutation-based search strategy to enhance the performance of the RSA on various optimization problems, and it tackles the main shortcomings of the original RSA, such as premature convergence and the imbalance between the search processes. Experiments are conducted on several benchmark functions and data clustering problems. The results are analyzed and compared with several state-of-the-art methods, including the aquila optimizer, grey wolf optimizer, sine cosine algorithm, whale optimization algorithm, dragonfly algorithm, and arithmetic optimization algorithm. The results show QMRSA's superiority in dealing with both the mathematical benchmark functions and real-world problems such as data clustering.
Keywords: Quantum Mutation, Reptile Search Algorithm (RSA), Global Optimization, Data Clustering, Algorithm
Optimization algorithms have been proposed based on the natural behaviors of various living organisms [1, 2]. In other words, the main features of optimization algorithms are that they are nature-inspired and that they aim to add new methods and techniques for solving optimization problems in various fields [3]. Particle swarm optimization (PSO) [4], the arithmetic optimization algorithm [5], the marine predators algorithm [6], the red deer algorithm [7], the aquila optimizer [8, 9], the grey wolf optimizer [10], the ant lion optimizer [11], the whale optimization algorithm [12], the salp swarm algorithm [13], moth flame optimization (MFO) [14], and the reptile search algorithm (RSA) [15] are examples of standard optimization algorithms that belong to the swarm-based branch. Optimization algorithms are usually inspired by social insect colonies and animal societies [16]; they emulate the behavior of animals searching for food or survival. The main features of these methods are their robustness in achieving optimal solutions and their flexibility in being adapted to many problems [17]. RSA is a recent population-based meta-heuristic proposed by Abualigah et al. [15]. It imitates the behavior of crocodiles through two key activities, encircling and hunting: the former is performed through high walking or belly walking, while the latter is achieved through hunting coordination or hunting cooperation. Meanwhile, Charles Darwin's theory of evolution, discussed in [18], remains the foundational idea behind much of modern biology. The theory's selection concept is based on random mutations, where the selection of valuable mutations helps determine the path of evolution. However, random selection of mutations is weaker than guided selection, known as adaptive mutation [19]. Therefore, many research works have aimed to enhance the processes of selection and mutation.
Han and Kim used the advantages of quantum computing to introduce a new version of the evolutionary algorithm, namely the quantum-inspired evolutionary algorithm (QEA) [20]. QEA depends on the theory and mechanisms of quantum computing, such as the superposition of states and the quantum bit (Q-bit). Taking advantage of quantum mutation (QM), the present work proposes an improvement of RSA named quantum mutation RSA (QMRSA). It addresses shortcomings such as loss of solution accuracy, premature convergence, and the imbalance between exploration and exploitation in the search process. Furthermore, QMRSA improves population diversity and the exploration ability of RSA and increases the chance of selecting the best solutions. The performance of QMRSA is evaluated by comparing it with other published methods from the literature on a set of benchmark functions and data clustering problems. The main contributions of this study can be summarized as follows:
Proposing an alternative data clustering method that depends on improved RSA performance.
Using QM to enhance the searching ability of RSA to find the optimal solution.
Assessing the efficiency of the developed method on a set of global optimization problems.
Evaluating the ability of the developed method as a clustering technique on different datasets.
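To make the QM idea concrete, the sketch below applies a quantum-style (delta-potential-well) mutation around the current best solution, in the spirit of quantum-behaved PSO variants. The function name, the `beta` scale, and the logarithmic step form are illustrative assumptions, not the exact operator defined later in this paper.

```python
import numpy as np

def quantum_mutation(x, best, beta=0.2, rng=None):
    """Quantum-style mutation sketch (illustrative, not the paper's operator):
    perturb a candidate around the best solution with a logarithmic
    (delta-potential-well) step, as in quantum-behaved PSO variants."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(1e-12, 1.0, size=x.shape)        # avoid log(0)
    step = beta * np.abs(best - x) * np.log(1.0 / u)  # heavy-tailed step size
    sign = np.where(rng.random(x.shape) < 0.5, 1.0, -1.0)
    return best + sign * step

x = np.array([1.0, 2.0, 3.0])
best = np.array([0.0, 0.0, 0.0])
mutated = quantum_mutation(x, best, rng=np.random.default_rng(0))
```

The heavy-tailed `log(1/u)` factor occasionally produces large jumps, which is what lets such a mutation escape local optima while usually sampling close to the best solution.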
This section briefly reviews existing quantum techniques used to improve meta-heuristic (MH) performance [21, 22]. For example, Kaveh et al. [23] proposed a hybrid algorithm that integrates QEA with colliding bodies optimization, called QECBO. The main goal of QECBO is to avoid being trapped in local optima by combining quantum-inspired uncertainty with enhanced Newtonian collision laws. The authors evaluated QECBO's performance using a set of benchmark functions; the results showed that QECBO achieved high-quality solutions and balanced exploitation and exploration. Moreover, many studies have highlighted the use of classical genetic algorithms to develop quantum computing; a summary of these studies is given in [24]. However, since their purpose was to rely on classical computers, it has become necessary to focus on improving the classical mechanisms themselves. Malossini et al. [25] proposed the quantum genetic optimization algorithm (QGOA), taking advantage of quantum methods for evaluating solution fitness and for the selection process. They also assessed the efficiency of QGOA through several comparisons of computational complexity. The experiments showed that QGOA outperformed classical genetic algorithms.
Liu et al. [26] used the advantages of a mutation technique to propose quantum-behaved particle swarm optimization (QPSO). The algorithm aims to strengthen the weak global search mechanism of the original PSO and thus avoid trapping in local minima. The authors employed a set of benchmark functions to evaluate QPSO's performance; the results illustrated that QPSO outperformed the basic PSO. In [27], the authors enhanced the stability of the backtracking search algorithm (BSA) using quantum Gaussian mutations and quasi-reflection-based initialization. The proposed algorithm, called gQR-BSA, modifies the structure of the basic BSA using the quantum Gaussian mutation to balance exploitation and exploration in the search process and locate optimal solutions, enabling it to solve various difficult problems. A set of benchmark functions and engineering optimization problems were used to evaluate the efficiency of gQR-BSA; the experimental results showed that gQR-BSA outperformed the basic BSA and its other variants.
Jiao et al. [28] integrated the basics and techniques of quantum computing into the immune clonal algorithm to enhance its convergence rate, yielding QICA. In this algorithm, the antibody population is split into sub-populations represented as quantum bit genes, and the turnover gate mechanism of quantum mutation is used to increase the convergence rate. A practical problem and a set of benchmark functions were utilized to demonstrate the efficiency of QICA's search mechanism; the results showed that QICA achieved better performance than other enhanced genetic algorithms in the literature. An enhanced quantum genetic algorithm was introduced in [29] to increase the chance of selecting the best solution in the search space; its strategy is based on quantum mutation and catastrophe processes with a self-adaptive rotation angle. The authors used function optimization problems to evaluate its efficiency and search ability. Xu et al. [30] introduced a quantum PSO with a mutation acceleration coefficient, named MOAQPSO, for hyperspectral endmember determination. Its main contribution is improving the search mechanism of the basic PSO by avoiding getting stuck in local optima and by increasing the diversity of solutions in the search space. Real and synthetic hyperspectral datasets were used to evaluate MOAQPSO's performance; the results illustrated that it outperformed other state-of-the-art algorithms. In [31], a quantum crow search optimization algorithm was proposed and applied to the image segmentation problem. In this technique, the initial population is generated using the quantum state concept, and a rotation operation is then used to update the current population. The algorithm was compared with other cluster-based image segmentation techniques and demonstrated superior performance.
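Several of the reviewed methods rest on the same two primitives: a Q-bit representation whose amplitudes encode the probability of observing a 0 or a 1, and a rotation gate that nudges those amplitudes toward the best-known bit string. The sketch below illustrates both; the angle encoding, step size, and clipping are illustrative choices, not any single paper's scheme.

```python
import numpy as np

# Q-bit sketch: each gene is an angle theta encoding amplitudes
# (cos theta, sin theta), which satisfy cos^2 + sin^2 = 1.

def observe(theta, rng):
    """Collapse each Q-bit to a classical bit: P(bit = 1) = sin(theta)^2."""
    return (rng.random(theta.shape) < np.sin(theta) ** 2).astype(int)

def rotate(theta, bits, best_bits, delta=np.pi / 8):
    """Rotation gate: nudge each angle toward the best-known bit,
    clipped to [0, pi/2] so P(1) stays monotone in theta."""
    direction = np.sign(best_bits - bits)
    return np.clip(theta + direction * delta, 0.0, np.pi / 2)

rng = np.random.default_rng(42)
theta = np.full(8, np.pi / 4)                  # equal superposition, P(1) = 0.5
target = np.array([1, 1, 1, 1, 0, 0, 0, 0])    # stand-in for the best solution
for _ in range(200):
    theta = rotate(theta, observe(theta, rng), target)
# After repeated rotations, the observation probabilities concentrate
# on the target bit string.
```

Repeated observe-and-rotate cycles are the core loop behind the QEA-style methods above; the population-level diversity comes from each observation being stochastic.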
In this section, the formulation of the clustering problem, the RSA, and the QM technique are introduced.
[Equations (1)–(15): the clustering problem formulation, the RSA encircling and hunting updates, and the QM operator.]
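For reference, the objective that metaheuristic clustering methods of this kind typically minimize is the total squared distance between each data point and its nearest cluster centroid, where a candidate solution encodes the centroid coordinates. A minimal sketch (the function name is illustrative):

```python
import numpy as np

def clustering_objective(data, centroids):
    """Sum of squared Euclidean distances from each point to its
    nearest centroid -- the intra-cluster criterion that metaheuristic
    clustering methods typically minimize."""
    # dists[i, j] = squared distance from point i to centroid j
    dists = ((data[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return dists.min(axis=1).sum()

# Two tight clusters and a centroid in the middle of each:
data = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
centroids = np.array([[0.0, 0.5], [10.0, 10.5]])
score = clustering_objective(data, centroids)   # 4 points x 0.25 = 1.0
```

An optimizer treats the flattened centroid matrix as one candidate solution, so a problem with k clusters in d dimensions has k·d decision variables.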
This section discusses the developed QMRSA.
[Equations (16)–(17): the QMRSA update rules.]
| No. | Algorithm | Parameter | Value |
|-----|-----------|-----------|-------|
| 1 | Aquila optimizer (AO) | α | 0.1 |
| | | δ | 0.1 |
| 2 | Grey wolf optimizer (GWO) | Convergence parameter (α) | Linear reduction from 2 to 0 |
| 3 | Sine cosine algorithm (SCA) | α | 0.05 |
| 4 | Whale optimization algorithm (WOA) | α | Decreased from 2 to 0 |
| 5 | Dragonfly algorithm (DA) | w | 0.2–0.9 |
| | | s, a, and c | 0.1 |
| | | f and e | 1 |
| 6 | Flow direction algorithm (FDA) | β | 3 |
| 7 | Arithmetic optimization algorithm (AOA) | α | 5 |
| | | μ | 0.5 |
| 8 | Reptile search algorithm (RSA) | α | 0.1 |
| | | β | 0.1 |
| 9 | Quantum mutation RSA (QMRSA) | α | 0.1 |
| | | β | 0.1 |
| | | β2 | 0.2 |
| Fun | Measure | AO | GWO | SCA | WOA | DA | FDA | AOA | RSA | QMRSA |
|-----|---------|----|-----|-----|-----|----|-----|-----|-----|-------|
| F1 | Best | 5.89E-120 | 5.19E-47 | 1.41E-09 | 3.31E-69 | 2.15E+01 | 3.23E-11 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | Average | 1.47E-120 | 1.66E-47 | 3.98E-10 | 8.28E-70 | 7.61E+00 | 1.53E-11 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | Worst | 7.14E-150 | 2.72E-49 | 2.70E-14 | 6.43E-78 | 1.44E-01 | 6.20E-13 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | STD | 2.95E-120 | 2.43E-47 | 6.81E-10 | 1.66E-69 | 9.96E+00 | 1.59E-11 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | p-value | 3.56E-01 | 2.20E-01 | 2.86E-01 | 3.56E-01 | 1.77E-01 | 1.03E-01 | NaN | NaN | NaN |
| | Rank | 4 | 6 | 8 | 5 | 9 | 7 | 1 | 1 | 1 |
| F2 | Best | 4.22E-58 | 7.88E-27 | 5.31E-09 | 7.34E-49 | 1.95E+00 | 1.27E-08 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | Average | 1.11E-58 | 2.65E-27 | 2.24E-09 | 1.84E-49 | 1.70E+00 | 5.55E-09 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | Worst | 3.07E-72 | 1.59E-28 | 6.09E-11 | 1.03E-53 | 1.52E+00 | 1.37E-09 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | STD | 2.07E-58 | 3.55E-27 | 2.54E-09 | 3.67E-49 | 1.81E-01 | 4.95E-09 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | p-value | 3.26E-01 | 1.87E-01 | 1.28E-01 | 3.55E-01 | 1.47E-06 | 6.62E-02 | NaN | NaN | NaN |
| | Rank | 4 | 6 | 7 | 5 | 9 | 8 | 1 | 1 | 1 |
| F3 | Best | 1.31E-110 | 1.05E-18 | 1.86E+00 | 9.42E+02 | 5.15E+02 | 2.38E-02 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | Average | 3.29E-111 | 2.85E-19 | 4.72E-01 | 4.41E+02 | 2.12E+02 | 8.57E-03 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | Worst | 2.69E-153 | 1.50E-24 | 1.48E-04 | 8.99E+01 | 2.77E+00 | 3.31E-03 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | STD | 6.57E-111 | 5.12E-19 | 9.28E-01 | 4.14E+02 | 2.16E+02 | 1.02E-02 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | p-value | 3.56E-01 | 3.09E-01 | 3.48E-01 | 7.70E-02 | 9.74E-02 | 1.44E-01 | NaN | NaN | NaN |
| | Rank | 4 | 5 | 7 | 9 | 8 | 6 | 1 | 1 | 1 |
| F4 | Best | 5.73E-71 | 1.30E-14 | 2.13E-02 | 4.86E+01 | 4.37E+00 | 1.69E+00 | 2.63E-119 | 0.00E+00 | 0.00E+00 |
| | Average | 1.46E-71 | 6.37E-15 | 5.81E-03 | 2.32E+01 | 2.83E+00 | 1.18E+00 | 6.57E-120 | 0.00E+00 | 0.00E+00 |
| | Worst | 2.28E-78 | 1.95E-15 | 5.75E-04 | 4.40E-01 | 1.51E+00 | 5.94E-01 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | STD | 2.85E-71 | 5.15E-15 | 1.03E-02 | 2.11E+01 | 1.19E+00 | 4.62E-01 | 1.31E-119 | 0.00E+00 | 0.00E+00 |
| | p-value | 3.45E-01 | 4.82E-02 | 3.04E-01 | 7.05E-02 | 3.09E-03 | 2.25E-03 | 3.56E-01 | NaN | NaN |
| | Rank | 4 | 5 | 6 | 9 | 8 | 7 | 3 | 1 | 1 |
| F5 | Best | 1.61E-03 | 8.05E+00 | 8.08E+00 | 8.93E+00 | 1.88E+03 | 6.45E+00 | 7.42E+00 | 8.61E+00 | 3.36E-29 |
| | Average | 8.82E-04 | 6.83E+00 | 7.54E+00 | 7.74E+00 | 5.69E+02 | 5.64E+00 | 7.23E+00 | 8.17E+00 | 2.80E-29 |
| | Worst | 3.13E-04 | 5.86E+00 | 7.28E+00 | 7.20E+00 | 2.20E+01 | 5.13E+00 | 6.95E+00 | 7.38E+00 | 1.87E-29 |
| | STD | 5.38E-04 | 9.88E-01 | 3.71E-01 | 8.10E-01 | 8.83E+02 | 6.35E-01 | 2.02E-01 | 5.43E-01 | 6.76E-30 |
| | p-value | 8.90E-08 | 5.42E-02 | 1.00E-01 | 4.04E-01 | 2.51E-01 | 9.13E-04 | 1.75E-02 | 8.89E-08 | NaN |
| | Rank | 2 | 4 | 6 | 7 | 9 | 3 | 5 | 8 | 1 |
| F6 | Best | 1.41E-04 | 5.77E-06 | 5.67E-01 | 1.23E-01 | 3.01E+00 | 1.58E+00 | 8.71E-02 | 2.25E+00 | 7.64E-11 |
| | Average | 8.03E-05 | 4.63E-06 | 4.19E-01 | 3.84E-02 | 2.09E+00 | 1.42E+00 | 4.51E-02 | 2.13E+00 | 2.29E-11 |
| | Worst | 2.44E-05 | 2.59E-06 | 2.58E-01 | 6.35E-03 | 1.03E-02 | 1.33E+00 | 7.21E-03 | 1.93E+00 | 6.09E-13 |
| | STD | 6.07E-05 | 1.40E-06 | 1.36E-01 | 5.63E-02 | 1.42E+00 | 1.11E-01 | 3.54E-02 | 1.56E-01 | 3.60E-11 |
| | p-value | 2.39E-07 | 2.39E-07 | 2.80E-05 | 5.56E-07 | 3.82E-01 | 2.39E-07 | 3.86E-07 | 3.11E-04 | NaN |
| | Rank | 3 | 2 | 6 | 4 | 8 | 7 | 5 | 9 | 1 |
| Fun | Measure | AO | GWO | SCA | WOA | DA | FDA | AOA | RSA | QMRSA |
|-----|---------|----|-----|-----|-----|----|-----|-----|-----|-------|
| F7 | Best | 1.43E-03 | 1.14E-03 | 3.08E-03 | 1.91E-03 | 3.84E-02 | 2.18E-02 | 1.99E-04 | 2.86E-04 | 2.26E-04 |
| | Average | 5.36E-04 | 7.39E-04 | 2.20E-03 | 1.09E-03 | 2.29E-02 | 8.78E-03 | 1.46E-04 | 2.00E-04 | 1.28E-04 |
| | Worst | 9.87E-05 | 4.25E-04 | 1.33E-03 | 2.38E-04 | 9.03E-03 | 4.07E-03 | 2.97E-05 | 7.95E-05 | 1.43E-05 |
| | STD | 6.04E-04 | 3.16E-04 | 9.99E-04 | 6.99E-04 | 1.31E-02 | 8.68E-03 | 7.94E-05 | 9.37E-05 | 8.77E-05 |
| | p-value | 3.13E-01 | 1.70E-02 | 7.26E-03 | 4.42E-02 | 1.34E-02 | 9.52E-02 | 4.19E-01 | 3.04E-01 | NaN |
| | Rank | 4 | 5 | 7 | 6 | 9 | 8 | 2 | 3 | 1 |
| F8 | Best | -1.92E+03 | -2.14E+03 | -1.75E+03 | -2.81E+03 | -2.38E+03 | -2.76E+03 | -2.41E+03 | -1.91E+03 | -1.81E+03 |
| | Average | -2.47E+03 | -2.69E+03 | -2.02E+03 | -3.55E+03 | -2.76E+03 | -3.08E+03 | -2.71E+03 | -1.94E+03 | -2.04E+03 |
| | Worst | -2.95E+03 | -3.00E+03 | -2.20E+03 | -4.19E+03 | -3.11E+03 | -3.36E+03 | -3.12E+03 | -1.99E+03 | -2.23E+03 |
| | STD | 4.23E+02 | 3.80E+02 | 1.95E+02 | 7.43E+02 | 2.98E+02 | 2.51E+02 | 3.08E+02 | 3.55E+01 | 1.95E+02 |
| | p-value | 1.13E-01 | 2.31E-02 | 9.02E-01 | 7.76E-03 | 6.75E-03 | 5.96E-04 | 1.07E-02 | 3.28E-01 | 1.00E+00 |
| | Rank | 6 | 5 | 8 | 1 | 3 | 2 | 4 | 9 | 7 |
| F9 | Best | 0.00E+00 | 9.95E+00 | 4.15E+00 | 2.58E+01 | 1.32E+01 | 1.59E+01 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | Average | 0.00E+00 | 2.49E+00 | 1.04E+00 | 6.45E+00 | 1.06E+01 | 9.46E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | Worst | 0.00E+00 | 0.00E+00 | 5.97E-13 | 0.00E+00 | 8.80E+00 | 3.98E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | STD | 0.00E+00 | 4.98E+00 | 2.07E+00 | 1.29E+01 | 1.96E+00 | 5.18E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | p-value | NaN | 3.56E-01 | 3.54E-01 | 3.56E-01 | 3.74E-05 | 1.07E-02 | NaN | NaN | NaN |
| | Rank | 1 | 6 | 5 | 7 | 9 | 8 | 1 | 1 | 1 |
| F10 | Best | 8.88E-16 | 1.51E-14 | 5.11E-06 | 4.44E-15 | 1.99E+01 | 2.01E+00 | 8.88E-16 | 8.88E-16 | 8.88E-16 |
| | Average | 8.88E-16 | 9.77E-15 | 1.87E-06 | 1.78E-15 | 8.15E+00 | 1.33E+00 | 8.88E-16 | 8.88E-16 | 8.88E-16 |
| | Worst | 8.88E-16 | 7.99E-15 | 2.36E-07 | 8.88E-16 | 2.10E+00 | 2.60E-06 | 8.88E-16 | 8.88E-16 | 8.88E-16 |
| | STD | 0.00E+00 | 3.55E-15 | 2.22E-06 | 1.78E-15 | 7.99E+00 | 9.01E-01 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | p-value | NaN | 2.45E-03 | 1.44E-01 | 3.56E-01 | 8.77E-02 | 2.58E-02 | NaN | NaN | NaN |
| | Rank | 1 | 6 | 7 | 5 | 9 | 8 | 1 | 1 | 1 |
| F11 | Best | 0.00E+00 | 1.69E-02 | 6.86E-01 | 3.99E-01 | 1.58E+00 | 4.92E-01 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | Average | 0.00E+00 | 7.55E-03 | 4.39E-01 | 1.40E-01 | 8.93E-01 | 3.17E-01 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | Worst | 0.00E+00 | 0.00E+00 | 1.42E-04 | 0.00E+00 | 5.17E-01 | 1.87E-01 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | STD | 0.00E+00 | 8.84E-03 | 3.15E-01 | 1.89E-01 | 4.74E-01 | 1.40E-01 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| | p-value | NaN | 1.38E-01 | 3.20E-02 | 1.90E-01 | 9.33E-03 | 4.06E-03 | NaN | NaN | NaN |
| | Rank | 1 | 5 | 8 | 6 | 9 | 7 | 1 | 1 | 1 |
| F12 | Best | 2.01E-05 | 3.97E-02 | 1.55E-01 | 1.44E-01 | 2.30E+00 | 1.24E+00 | 3.89E-01 | 7.23E-01 | 6.09E-02 |
| | Average | 5.33E-06 | 1.49E-02 | 1.28E-01 | 5.09E-02 | 1.15E+00 | 6.22E-01 | 2.78E-01 | 6.07E-01 | 5.63E-02 |
| | Worst | 7.60E-08 | 5.43E-06 | 9.17E-02 | 7.58E-03 | 2.07E-01 | 9.48E-09 | 1.97E-01 | 5.16E-01 | 5.00E-02 |
| | STD | 9.82E-06 | 1.90E-02 | 2.71E-02 | 6.39E-02 | 8.79E-01 | 5.68E-01 | 8.70E-02 | 9.08E-02 | 5.49E-03 |
| | p-value | 6.88E-04 | 1.04E-03 | 1.66E-02 | 5.60E-03 | 9.55E-02 | 2.77E-01 | 2.24E-03 | 1.96E-03 | NaN |
| | Rank | 1 | 2 | 5 | 3 | 9 | 8 | 6 | 7 | 4 |
| F13 | Best | 1.79E-05 | 1.02E-01 | 4.49E-01 | 1.62E-01 | 4.00E+00 | 9.92E-10 | 8.84E-01 | 8.27E-28 | 4.85E-31 |
| | Average | 9.39E-06 | 5.02E-02 | 3.46E-01 | 8.22E-02 | 1.95E+00 | 2.87E-10 | 7.41E-01 | 2.93E-28 | 2.55E-31 |
| | Worst | 9.75E-10 | 2.98E-06 | 2.18E-01 | 1.52E-02 | 7.74E-01 | 3.02E-11 | 4.83E-01 | 4.60E-31 | 2.46E-32 |
| | STD | 7.64E-06 | 5.80E-02 | 1.07E-01 | 6.12E-02 | 1.41E+00 | 4.70E-10 | 1.79E-01 | 3.89E-28 | 2.61E-31 |
| | p-value | 4.93E-02 | 1.34E-01 | 6.59E-04 | 3.64E-02 | 3.24E-02 | 2.68E-01 | 1.67E-04 | 1.83E-01 | NaN |
| | Rank | 4 | 5 | 7 | 6 | 9 | 3 | 8 | 2 | 1 |
| Mean ranking | | 3 | 4.769 | 6.692 | 5.615 | 8.308 | 6.308 | 3 | 3.462 | 1.692 |
| Final ranking | | 2 | 5 | 8 | 6 | 9 | 7 | 2 | 4 | 1 |
| Fun | Measure | AO | GWO | SCA | WOA | DA | FDA | AOA | RSA | QMRSA |
|-----|---------|----|-----|-----|-----|----|-----|-----|-----|-------|
| F14 | Best | 2.98E+00 | 1.08E+01 | 1.01E+00 | 1.08E+01 | 1.99E+00 | 6.83E+00 | 1.27E+01 | 2.98E+00 | 9.98E-01 |
| | Average | 1.99E+00 | 4.18E+00 | 1.00E+00 | 3.94E+00 | 1.25E+00 | 3.75E+00 | 9.51E+00 | 2.79E+00 | 9.98E-01 |
| | Worst | 9.98E-01 | 9.98E-01 | 9.99E-01 | 9.98E-01 | 9.98E-01 | 2.21E+00 | 9.98E-01 | 2.20E+00 | 9.98E-01 |
| | STD | 1.15E+00 | 4.46E+00 | 6.64E-03 | 4.65E+00 | 4.97E-01 | 2.08E+00 | 5.70E+00 | 3.90E-01 | 1.34E-13 |
| | p-value | 1.89E-01 | 8.66E-01 | 3.87E-02 | 9.44E-01 | 5.80E-02 | 3.84E-02 | 1.06E-01 | 3.99E-01 | NaN |
| | Rank | 4 | 8 | 2 | 7 | 3 | 6 | 9 | 5 | 1 |
| F15 | Best | 5.95E-04 | 1.57E-03 | 1.53E-03 | 7.83E-04 | 2.26E-02 | 7.68E-04 | 3.48E-03 | 1.83E-03 | 4.58E-04 |
| | Average | 4.68E-04 | 1.09E-03 | 1.25E-03 | 6.84E-04 | 6.27E-03 | 5.36E-04 | 1.63E-03 | 1.65E-03 | 3.86E-04 |
| | Worst | 3.49E-04 | 5.56E-04 | 5.55E-04 | 4.37E-04 | 7.63E-04 | 3.07E-04 | 3.73E-04 | 1.42E-03 | 3.10E-04 |
| | STD | 1.34E-04 | 4.64E-04 | 4.67E-04 | 1.65E-04 | 1.09E-02 | 2.64E-04 | 1.52E-03 | 1.79E-04 | 6.07E-05 |
| | p-value | 4.12E-02 | 2.33E-02 | 6.51E-01 | 1.47E-01 | 3.77E-01 | 8.18E-02 | 5.21E-01 | 6.72E-02 | NaN |
| | Rank | 2 | 5 | 6 | 4 | 9 | 3 | 7 | 8 | 1 |
| F16 | Best | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 |
| | Average | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 |
| | Worst | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 | -1.03E+00 |
| | STD | 1.67E-03 | 7.80E-08 | 8.11E-05 | 3.20E-09 | 2.06E-04 | 9.50E-05 | 9.19E-08 | 2.54E-04 | 0.00E+00 |
| | p-value | 1.82E-01 | 1.66E-02 | 1.49E-01 | 1.65E-02 | 6.57E-01 | 1.65E-02 | 1.66E-02 | 7.85E-03 | NaN |
| | Rank | 9 | 3 | 5 | 2 | 6 | 7 | 4 | 8 | 1 |
| F17 | Best | 4.00E-01 | 4.00E-01 | 4.06E-01 | 3.98E-01 | 3.98E-01 | 4.12E-01 | 4.30E-01 | 6.23E-01 | 3.98E-01 |
| | Average | 3.99E-01 | 3.98E-01 | 4.01E-01 | 3.98E-01 | 3.98E-01 | 4.03E-01 | 4.12E-01 | 4.64E-01 | 3.98E-01 |
| | Worst | 3.98E-01 | 3.98E-01 | 3.98E-01 | 3.98E-01 | 3.98E-01 | 3.98E-01 | 4.05E-01 | 4.01E-01 | 3.98E-01 |
| | STD | 8.97E-04 | 9.19E-04 | 3.67E-03 | 4.40E-06 | 0.00E+00 | 6.27E-03 | 1.19E-02 | 1.07E-01 | 0.00E+00 |
| | p-value | 2.28E-01 | 1.95E-01 | 6.59E-01 | 1.56E-01 | 1.56E-01 | 1.56E-01 | 2.38E-01 | 2.96E-01 | NaN |
| | Rank | 5 | 4 | 6 | 3 | 1 | 7 | 8 | 9 | 1 |
| F18 | Best | 3.10E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+01 | 3.01E+01 | 3.00E+00 |
| | Average | 3.04E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 | 9.75E+00 | 9.79E+00 | 3.00E+00 |
| | Worst | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 |
| | STD | 4.52E-02 | 8.26E-05 | 4.54E-05 | 1.49E-03 | 2.28E-04 | 2.56E-16 | 1.35E+01 | 1.36E+01 | 3.79E-08 |
| | p-value | 1.59E-01 | 3.25E-01 | 2.04E-01 | 2.97E-01 | 1.04E-01 | 1.04E-01 | 3.56E-01 | 3.56E-01 | NaN |
| | Rank | 7 | 4 | 3 | 6 | 5 | 1 | 8 | 9 | 2 |
| F19 | Best | -3.85E+00 | -3.86E+00 | -3.85E+00 | -3.86E+00 | -3.86E+00 | -3.83E+00 | -3.85E+00 | -3.75E+00 | -3.86E+00 |
| | Average | -3.85E+00 | -3.86E+00 | -3.85E+00 | -3.86E+00 | -3.86E+00 | -3.85E+00 | -3.85E+00 | -3.80E+00 | -3.86E+00 |
| | Worst | -3.86E+00 | -3.86E+00 | -3.85E+00 | -3.86E+00 | -3.86E+00 | -3.85E+00 | -3.85E+00 | -3.85E+00 | -3.86E+00 |
| | STD | 5.73E-03 | 3.75E-03 | 1.81E-03 | 3.05E-03 | 5.19E-05 | 1.08E-02 | 7.33E-04 | 4.50E-02 | 2.56E-16 |
| | p-value | 2.23E-01 | 3.77E-02 | 2.28E-01 | 4.31E-02 | 1.89E-02 | 1.85E-02 | 2.34E-01 | 1.03E-01 | NaN |
| | Rank | 5 | 3 | 6 | 4 | 2 | 8 | 7 | 9 | 1 |
| F20 | Best | -3.05E+00 | -3.16E+00 | -2.04E+00 | -3.06E+00 | -2.95E+00 | -2.81E+00 | -3.00E+00 | -1.90E+00 | -3.20E+00 |
| | Average | -3.13E+00 | -3.25E+00 | -2.50E+00 | -3.18E+00 | -3.17E+00 | -2.95E+00 | -3.03E+00 | -2.28E+00 | -3.29E+00 |
| | Worst | -3.26E+00 | -3.32E+00 | -2.97E+00 | -3.32E+00 | -3.32E+00 | -3.07E+00 | -3.05E+00 | -2.71E+00 | -3.32E+00 |
| | STD | 9.45E-02 | 8.30E-02 | 4.22E-01 | 1.11E-01 | 1.55E-01 | 1.08E-01 | 2.67E-02 | 3.64E-01 | 5.94E-02 |
| | p-value | 5.04E-02 | 4.46E-03 | 8.13E-02 | 2.60E-02 | 6.04E-02 | 1.44E-03 | 2.01E-01 | 1.26E-02 | NaN |
| | Rank | 5 | 2 | 8 | 3 | 4 | 7 | 6 | 9 | 1 |
| Fun | Measure | AO | GWO | SCA | WOA | DA | FDA | AOA | RSA | QMRSA |
|-----|---------|----|-----|-----|-----|----|-----|-----|-----|-------|
| F21 | Best | -1.01E+01 | -2.63E+00 | -8.79E-01 | -5.06E+00 | -2.62E+00 | -4.59E+00 | -1.61E+00 | -5.06E+00 | -5.05E+00 |
| | Average | -1.02E+01 | -5.73E+00 | -3.08E+00 | -5.06E+00 | -5.13E+00 | -7.50E+00 | -2.27E+00 | -5.06E+00 | -8.86E+00 |
| | Worst | -1.02E+01 | -1.02E+01 | -4.91E+00 | -5.06E+00 | -1.02E+01 | -1.02E+01 | -3.03E+00 | -5.06E+00 | -1.01E+01 |
| | STD | 1.56E-03 | 3.16E+00 | 1.99E+00 | 1.33E-07 | 3.55E+00 | 3.07E+00 | 7.39E-01 | 1.69E-07 | 2.53E+00 |
| | p-value | 8.72E-22 | 6.83E-01 | 9.44E-02 | 2.40E-02 | 9.69E-01 | 1.63E-01 | 2.85E-04 | 8.15E-01 | NaN |
| | Rank | 1 | 4 | 8 | 7 | 5 | 3 | 9 | 6 | 2 |
| F22 | Best | -1.04E+01 | -5.09E+00 | -9.06E-01 | -3.72E+00 | -2.70E+00 | -2.77E+00 | -2.44E+00 | -5.09E+00 | -1.04E+01 |
| | Average | -1.04E+01 | -5.09E+00 | -3.45E+00 | -6.06E+00 | -5.80E+00 | -5.51E+00 | -3.31E+00 | -5.09E+00 | -1.04E+01 |
| | Worst | -1.04E+01 | -5.09E+00 | -4.95E+00 | -1.04E+01 | -1.03E+01 | -1.04E+01 | -4.29E+00 | -5.09E+00 | -1.04E+01 |
| | STD | 1.21E-02 | 1.14E-06 | 1.90E+00 | 2.95E+00 | 3.18E+00 | 3.41E+00 | 8.00E-01 | 8.35E-07 | 1.41E-03 |
| | p-value | 1.46E-16 | 3.63E-22 | 1.35E-01 | 5.32E-01 | 6.72E-01 | 8.14E-01 | 4.31E-03 | 8.81E-01 | NaN |
| | Rank | 2 | 6 | 8 | 3 | 4 | 5 | 9 | 7 | 1 |
| F23 | Best | -5.13E+00 | -1.05E+01 | -9.41E-01 | -2.80E+00 | -2.81E+00 | -2.87E+00 | -1.23E+00 | -5.13E+00 | -5.13E+00 |
| | Average | -5.13E+00 | -1.05E+01 | -3.16E+00 | -5.83E+00 | -4.01E+00 | -6.94E+00 | -3.83E+00 | -5.13E+00 | -5.13E+00 |
| | Worst | -5.13E+00 | -1.05E+01 | -4.94E+00 | -1.03E+01 | -5.18E+00 | -1.05E+01 | -5.62E+00 | -5.13E+00 | -5.13E+00 |
| | STD | 2.04E-07 | 1.37E-03 | 1.72E+00 | 3.16E+00 | 1.35E+00 | 4.17E+00 | 1.86E+00 | 1.95E-06 | 2.04E-07 |
| | p-value | 1.15E-21 | 2.82E-22 | 6.21E-02 | 6.71E-01 | 1.48E-01 | 4.17E-01 | 2.14E-01 | 4.01E-01 | NaN |
| | Rank | 5 | 1 | 9 | 3 | 7 | 2 | 8 | 4 | 5 |
| Mean ranking | | 4.5 | 4 | 6.1 | 4.2 | 4.6 | 4.9 | 7.5 | 7.4 | 1.6 |
| Final ranking | | 4 | 2 | 7 | 3 | 5 | 6 | 9 | 8 | 1 |
| Dataset | Features no. | Instances no. | Classes no. |
|---------|--------------|---------------|-------------|
| Cancer | 9 | 683 | 2 |
| CMC | 10 | 1473 | 3 |
| Glass | 9 | 214 | 7 |
| Iris | 4 | 150 | 3 |
| Seeds | 7 | 210 | 3 |
| Heart | 13 | 270 | 2 |
| Vowels | 6 | 871 | 3 |
| Wine | 13 | 178 | 3 |
| Dataset | Measure | AO | GWO | SCA | WOA | DA | FDA | AOA | RSA | QMRSA |
|---------|---------|----|-----|-----|-----|----|-----|-----|-----|-------|
| Cancer | Worst | 2.93E+03 | 2.90E+03 | 3.44E+03 | 3.34E+03 | 3.48E+03 | 7.47E+02 | 3.30E+03 | 5.24E+02 | 3.46E+02 |
| | Average | 2.80E+03 | 2.68E+03 | 3.09E+03 | 3.15E+03 | 3.44E+03 | 5.48E+02 | 3.25E+03 | 3.56E+02 | 1.92E+02 |
| | Best | 2.67E+03 | 2.40E+03 | 2.88E+03 | 2.80E+03 | 3.39E+03 | 2.50E+02 | 3.18E+03 | 2.44E+02 | 5.84E+01 |
| | STD | 1.04E+02 | 2.50E+02 | 2.07E+02 | 2.40E+02 | 3.44E+01 | 1.99E+02 | 4.97E+01 | 1.13E+02 | 1.12E+02 |
| | p-value | 4.02E-10 | 6.17E-08 | 5.30E-09 | 1.14E-08 | 7.96E-12 | 9.71E-02 | 1.87E-11 | 4.98E-02 | NaN |
| | Rank | 5 | 4 | 6 | 7 | 9 | 3 | 8 | 2 | 1 |
| CMC | Worst | 3.30E+02 | 3.10E+02 | 3.35E+02 | 3.34E+02 | 3.35E+02 | 9.70E+01 | 3.34E+02 | 7.61E+01 | 8.35E+01 |
| | Average | 3.28E+02 | 3.07E+02 | 3.33E+02 | 3.33E+02 | 3.33E+02 | 9.24E+01 | 3.33E+02 | 7.15E+01 | 7.04E+01 |
| | Best | 3.25E+02 | 3.02E+02 | 3.32E+02 | 3.31E+02 | 3.30E+02 | 8.44E+01 | 3.31E+02 | 6.80E+01 | 4.96E+01 |
| | STD | 2.45E+00 | 2.81E+00 | 1.34E+00 | 1.37E+00 | 1.95E+00 | 1.28E+01 | 1.09E+00 | 3.59E+00 | 4.87E+00 |
| | p-value | 1.49E-13 | 3.99E-13 | 6.67E-14 | 6.89E-14 | 9.21E-14 | 7.15E-03 | 6.20E-14 | 5.75E-05 | NaN |
| | Rank | 5 | 4 | 9 | 8 | 6 | 3 | 7 | 2 | 1 |
| Glass | Worst | 3.11E+01 | 3.32E+01 | 3.49E+01 | 3.46E+01 | 3.51E+01 | 4.49E+00 | 3.43E+01 | 4.22E+00 | 1.65E+00 |
| | Average | 3.02E+01 | 3.00E+01 | 3.47E+01 | 3.40E+01 | 3.48E+01 | 1.52E+00 | 3.37E+01 | 3.57E+00 | 1.05E+00 |
| | Best | 2.84E+01 | 2.63E+01 | 3.46E+01 | 3.37E+01 | 3.44E+01 | 0.00E+00 | 3.24E+01 | 3.15E+00 | 7.14E-01 |
| | STD | 1.10E+00 | 2.47E+00 | 1.34E-01 | 3.76E-01 | 2.80E-01 | 1.83E+00 | 7.62E-01 | 3.85E-01 | 4.20E-01 |
| | p-value | 2.57E-11 | 1.12E-08 | 2.91E-15 | 2.51E-14 | 8.44E-15 | 4.04E-02 | 8.54E-13 | 9.26E-06 | NaN |
| | Rank | 5 | 4 | 8 | 7 | 9 | 2 | 6 | 3 | 1 |
| Iris | Worst | 2.08E+01 | 1.63E+01 | 2.46E+01 | 2.43E+01 | 2.44E+01 | 6.27E+00 | 2.39E+01 | 5.18E+00 | 5.18E+00 |
| | Average | 1.91E+01 | 1.42E+01 | 2.37E+01 | 2.32E+01 | 2.37E+01 | 3.48E+00 | 2.37E+01 | 3.82E+00 | 3.82E+00 |
| | Best | 1.79E+01 | 1.27E+01 | 2.21E+01 | 2.23E+01 | 2.28E+01 | 2.02E+00 | 2.35E+01 | 2.53E+00 | 2.53E+00 |
| | STD | 1.12E+00 | 1.72E+00 | 1.04E+00 | 7.72E-01 | 7.98E-01 | 1.71E+00 | 1.55E-01 | 8.19E-01 | 9.75E-01 |
| | p-value | 1.30E-08 | 2.53E-06 | 1.22E-09 | 5.12E-10 | 4.52E-10 | 7.12E-01 | 6.48E-11 | 1.01E-02 | NaN |
| | Rank | 5 | 4 | 9 | 6 | 8 | 1 | 7 | 2 | 2 |
| Seeds | Worst | 4.01E+01 | 3.80E+01 | 5.00E+01 | 4.99E+01 | 5.03E+01 | 1.33E+01 | 4.89E+01 | 1.16E+01 | 4.95E+00 |
| | Average | 3.87E+01 | 3.62E+01 | 4.92E+01 | 4.81E+01 | 4.98E+01 | 7.71E+00 | 4.85E+01 | 9.41E+00 | 4.09E+00 |
| | Best | 3.77E+01 | 3.41E+01 | 4.76E+01 | 4.66E+01 | 4.93E+01 | 4.80E+00 | 4.81E+01 | 8.27E+00 | 2.58E+00 |
| | STD | 9.81E-01 | 1.79E+00 | 1.05E+00 | 1.52E+00 | 3.44E-01 | 3.74E+00 | 2.97E-01 | 9.45E-01 | 1.33E+00 |
| | p-value | 1.82E-10 | 3.94E-09 | 1.92E-11 | 9.69E-11 | 3.20E-12 | 3.66E-01 | 3.90E-12 | 8.52E-05 | NaN |
| | Rank | 5 | 4 | 8 | 6 | 9 | 2 | 7 | 3 | 1 |
| Statlog (Heart) | Worst | 1.37E+03 | 1.17E+03 | 1.68E+03 | 1.61E+03 | 1.63E+03 | 5.78E+01 | 1.63E+03 | 7.24E+01 | 3.97E+01 |
| | Average | 1.29E+03 | 9.24E+02 | 1.62E+03 | 1.58E+03 | 1.54E+03 | 2.05E+01 | 1.55E+03 | 3.68E+01 | 2.40E+01 |
| | Best | 1.16E+03 | 7.55E+02 | 1.49E+03 | 1.52E+03 | 1.28E+03 | 1.25E+01 | 1.46E+01 | 1.16E+01 | 1.15E+01 |
| | STD | 8.74E+01 | 1.68E+02 | 7.74E+01 | 3.54E+01 | 1.48E+02 | 2.84E+01 | 6.14E+01 | 1.57E+01 | 3.15E+01 |
| | p-value | 1.55E-09 | 2.71E-06 | 1.07E-10 | 1.44E-12 | 1.84E-08 | 4.14E-01 | 3.25E-11 | 4.41E-01 | NaN |
| | Rank | 5 | 4 | 9 | 8 | 6 | 1 | 7 | 3 | 2 |
| Vowels | Worst | 1.42E+02 | 1.36E+02 | 1.53E+02 | 1.52E+02 | 1.53E+02 | 1.76E+01 | 1.53E+02 | 2.30E+01 | 3.58E+01 |
| | Average | 1.40E+02 | 1.31E+02 | 1.53E+02 | 1.51E+02 | 1.53E+02 | 1.27E+01 | 1.52E+02 | 2.09E+01 | 3.00E+01 |
| | Best | 1.36E+02 | 1.25E+02 | 1.53E+02 | 1.50E+02 | 1.52E+02 | 9.94E+00 | 1.52E+02 | 1.90E+01 | 2.07E+01 |
| | STD | 2.60E+00 | 4.88E+00 | 1.40E-01 | 6.06E-01 | 3.93E-01 | 3.13E+00 | 3.08E-01 | 1.78E+00 | 6.16E+00 |
| | p-value | 3.40E-10 | 2.25E-09 | 6.97E-11 | 8.12E-11 | 7.24E-11 | 5.11E-04 | 7.35E-11 | 1.34E-02 | NaN |
| | Rank | 5 | 4 | 9 | 6 | 8 | 1 | 7 | 2 | 3 |
| Wine | Worst | 3.33E+03 | 2.84E+03 | 3.93E+03 | 3.85E+03 | 3.98E+03 | 9.09E+02 | 3.84E+03 | 6.76E+02 | 6.76E+02 |
| | Average | 3.02E+03 | 2.46E+03 | 3.83E+03 | 3.74E+03 | 3.86E+03 | 4.64E+02 | 3.79E+03 | 4.84E+02 | 4.84E+02 |
| | Best | 2.73E+03 | 2.15E+03 | 3.60E+03 | 3.59E+03 | 3.65E+03 | 2.20E+02 | 3.70E+03 | 3.51E+02 | 3.51E+02 |
| | STD | 2.15E+02 | 3.45E+02 | 1.33E+02 | 1.11E+02 | 1.48E+02 | 2.84E+02 | 6.02E+01 | 1.14E+02 | 1.35E+02 |
| | p-value | 1.69E-08 | 2.27E-06 | 1.83E-10 | 1.22E-10 | 2.69E-10 | 8.90E-01 | 2.84E-11 | 4.23E-02 | NaN |
| | Rank | 5 | 4 | 8 | 6 | 9 | 1 | 7 | 2 | 2 |
| Mean ranking | | 5.00E+00 | 4.00E+00 | 8.25E+00 | 6.75E+00 | 8.00E+00 | 1.75E+00 | 7.00E+00 | 2.38E+00 | 1.63E+00 |
| Final ranking | | 5 | 4 | 9 | 6 | 8 | 2 | 7 | 3 | 1 |
This paper has presented a modified version of a recent metaheuristic, the reptile search algorithm (RSA). The modification uses the strength of QM to improve RSA's ability to balance exploration and exploitation, which improves solution diversity and increases the convergence rate, and is reflected in the quality of the final solution. To assess the performance of the developed method, a set of experiments was conducted using standard benchmark functions with different characteristics. In addition, the results of the proposed QMRSA were compared with AO, GWO, SCA, WOA, DA, FDA, AOA, and the traditional RSA. The results show the high ability of the proposed QMRSA to find the best solution on the tested functions compared with the other methods. To evaluate its applicability, QMRSA was also used as a clustering technique; the clustering problem is NP-hard and has several real-world applications in IoT, cloud computing, data mining, etc. The proposed QMRSA was applied to eight datasets, and the results were compared with the same algorithms used in the first experiment. The clustering results and the statistical Friedman test confirm the high ability of QMRSA to determine the number of clusters and the central points. Moreover, the main advantage of the proposed method is that it can find new best solutions with a high accuracy rate. According to the obtained results, the improved QMRSA can also be applied to photovoltaic problems, task scheduling, engineering design problems, and feature selection.
Conceptualization, LA. Methodology, LA. Investigation and methodology, RA, MM, SC, MS, LA, MAE. Supervision, LA. Writing of the original draft, RA, MM, SC, MS, LA, MAE. Software, LA. Validation, LA. Visualization, RA, MM, SC, MS, MAE.
This research project was funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2022R239), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
The authors declare that they have no competing interests.
Name : Almodfer Rolla
Affiliation : School of Information Engineering, Henan Institute of Science and Technology, Xinxiang 453003, China
Biography : Almodfer Rolla is an assistant professor at School of Information Engineering, Henan Institute of Science and Technology, Xinxiang 453003, China. Her main research interests are optimization and machine learning.
Name : Mohammed Mudhsh
Affiliation : School of Information Engineering, Henan Institute of Science and Technology, Xinxiang 453003, China
Biography : Mohammed Mudhsh is an assistant professor at the School of Information Engineering, Henan Institute of Science and Technology, Xinxiang 453003, China. His main research interests are artificial intelligence, optimization, and machine learning.
Name : Samia Chelloug
Affiliation : Information Technology Department, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University
Biography : Samia Chelloug is an assistant professor in the Information Technology Department, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University. Her main research interests are optimization and machine learning, ad hoc networks, concave programming, and network coding.
Name : Mohammad Shehab
Affiliation : Information Technology, The World Islamic Sciences and Education University, Amman, Jordan
Biography : Mohammad Shehab received his B.Sc. in Software Engineering from Al-Zaytoonah University of Jordan in 2009. He received his master's degree in Computer Science/Artificial Intelligence from Universiti Sains Malaysia in 2012, and his Ph.D. in Computer Science/Artificial Intelligence and Software Engineering from Universiti Sains Malaysia in 2018. Dr. Shehab's research focuses on metaheuristic algorithms, particularly in the areas of population modeling and parameter control.
Name : Laith Abualigah
Affiliation : Faculty of Computer Sciences and Informatics, Amman Arab University, 11953 Amman, Jordan.
Biography : Laith Abualigah is an Assistant Professor at the Computer Science Department, Amman Arab University, Jordan. He is also a distinguished researcher at the School of Computer Science, Universiti Sains Malaysia, Malaysia. He received his first degree in Computer Information Systems from Al-Albayt University, Jordan, in 2011, a Master's degree in Computer Science from Al-Albayt University, Jordan, in 2014, and a Ph.D. from the School of Computer Science, Universiti Sains Malaysia (USM), Malaysia, in 2018. According to the report published by Stanford University in 2020, Abualigah is among the top 2% of influential scholars, a list covering the 100,000 top scientists in the world. Abualigah has published more than 100 journal papers and books, which collectively have been cited more than 4,300 times (H-index = 32). His main research interests focus on the Arithmetic Optimization Algorithm (AOA), bio-inspired computing, nature-inspired computing, swarm intelligence, artificial intelligence, meta-heuristic modeling, optimization algorithms, evolutionary computation, information retrieval, text clustering, feature selection, combinatorial problems, optimization, advanced machine learning, big data, and natural language processing. Abualigah currently serves as an associate editor of Cluster Computing (Springer), Soft Computing (Springer), and the Journal of King Saud University - Computer and Information Sciences (Elsevier).
Name : Mohamed Abd Elaziz
Affiliation : Department of Mathematics, Faculty of Science, Zagazig University, Zagazig 44519, Egypt
Biography : Mohamed Abd Elaziz received the B.S. and M.S. degrees in computer science and the Ph.D. degree in mathematics and computer science from Zagazig University, Egypt, in 2008, 2011, and 2014, respectively. From 2008 to 2011, he was an Assistant Lecturer with the Department of Computer Science. He is currently an Associate Professor with Zagazig University. He has authored or coauthored more than 100 articles. His research interests include metaheuristic techniques, cloud computing, machine learning, signal processing, image processing, and evolutionary algorithms.
Rolla Almodfer, Mohammed Mudhsh, Samia Chelloug, Mohammad Shehab, Laith Abualigah, and Mohamed Abd Elaziz, Quantum Mutation Reptile Search Algorithm for Global Optimization and Data Clustering, Human-centric Computing and Information Sciences, Article number: 12:30 (2022).