Accelerating Random Orthogonal Search for Global Optimization using Crossover
Refereed conference paper presented and published in conference proceedings
CUHK Authors
Full Text
There are no full text file(s) associated with this record.
Other information
Abstract: Pure Random Orthogonal Search (PROS) is a parameterless evolutionary algorithm (EA) that has shown superior performance when compared to many existing EAs on well-known benchmark functions with limited search budgets. Its implementation simplicity, computational efficiency, and lack of hyperparameters make it attractive to both researchers and practitioners. However, PROS can be inefficient when the error requirement becomes stringent. In this paper, we propose an extension to PROS, called Pure Random Orthogonal Search with Crossover (PROS-C), which aims to improve the convergence rate of PROS while maintaining its simplicity. We analyze the performance of PROS-C on a class of functions that are monotonically increasing in each single dimension. Our numerical experiments demonstrate that, with the addition of a simple crossover operation, PROS-C consistently and significantly reduces the errors of the obtained solutions on a wide range of benchmark functions. Moreover, PROS-C converges faster than Genetic Algorithms (GA) on benchmark functions with a normalized error requirement ranging from 0.1 to 0.0001. The results suggest that PROS-C is a promising algorithm for optimization problems that require high computational efficiency and solutions of reasonable precision.
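For readers unfamiliar with the base algorithm, the sketch below illustrates the general idea described in the abstract: PROS perturbs one randomly chosen coordinate of the incumbent solution per iteration and keeps the candidate only if it improves, while PROS-C adds a crossover step. The crossover operator shown here (uniform coordinate mixing with a previous incumbent, gated by a hypothetical crossover_prob parameter) and the function name pros_c are illustrative assumptions; the paper's exact operator is not detailed in this record.

```python
import numpy as np

def pros_c(f, lower, upper, budget=10_000, crossover_prob=0.5, rng=None):
    # Minimal sketch of Pure Random Orthogonal Search (PROS) with an added
    # crossover step. The crossover here is an assumption for illustration,
    # not necessarily the operator proposed in the paper.
    rng = np.random.default_rng() if rng is None else rng
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    d = lower.size

    best = rng.uniform(lower, upper)   # random starting point in the search box
    best_val = f(best)
    prev_best = best.copy()            # an earlier incumbent, reused by the crossover

    for _ in range(budget):
        cand = best.copy()
        k = rng.integers(d)                        # pick one dimension at random
        cand[k] = rng.uniform(lower[k], upper[k])  # orthogonal (single-axis) move

        if rng.random() < crossover_prob:
            # Uniform crossover between the candidate and a previous incumbent.
            mask = rng.random(d) < 0.5
            cand = np.where(mask, cand, prev_best)

        val = f(cand)
        if val < best_val:             # greedy acceptance: keep only improvements
            prev_best = best
            best, best_val = cand, val

    return best, best_val


# Usage example: minimize the sphere function in 10 dimensions.
if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x * x))
    x_best, f_best = pros_c(sphere, lower=[-5.0] * 10, upper=[5.0] * 10)
    print(x_best, f_best)
```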
Acceptance Date: 23/06/2023
All Author(s) List: Bruce Kwong-Bun Tong, Wing Cheong Lau, Chi Wan Sung, Wing Shing Wong
Name of Conference: The 9th Annual Conference on Machine Learning, Optimization and Data Science (LOD)
Start Date of Conference: 22/09/2023
End Date of Conference: 26/09/2023
Place of Conference: Grasmere, Lake District, England
Country/Region of Conference: Great Britain
Year: 2023
Month: 9
Languages: English-United States