TY - GEN
T1 - A Hybrid Multi-Objective Teaching Learning-Based Optimization Using Reference Points and R2 Indicator
AU - Tahernezhadjavazm, Farajollah
AU - Rankin, Debbie
AU - Coyle, Damien
PY - 2022/6/24
Y1 - 2022/6/24
N2 - Hybrid multi-objective evolutionary algorithms have recently become a hot topic in the field of metaheuristics. Introducing new algorithms that inherit operators and structures from existing algorithms can improve performance. Here, we propose a hybrid multi-objective algorithm based on the operators of the genetic algorithm (GA) and teaching learning-based optimization (TLBO), combined with the structures of the reference point-based method (from NSGA-III) and the R2 indicator method. The new algorithm (R2-HMTLBO) improves diversity and convergence by using the NSGA-III mechanism and R2-based TLBO, respectively. An elite archive is also proposed to further enhance performance. The proposed multi-objective algorithm is evaluated on 19 benchmark test problems and compared with four state-of-the-art algorithms. The IGD metric is used for comparison, and the results reveal that the proposed R2-HMTLBO significantly outperforms MOEA/D, MOMBI-II, and MOEA/IGD-NS on 16/19, 14/19, and 13/19 test problems, respectively. Furthermore, R2-HMTLBO obtains considerably better results than all other algorithms on 4 test problems, although it does not outperform NSGA-III on a number of tests.
KW - Evolutionary algorithm
KW - Multi-objective optimization
KW - Genetic algorithm
KW - Teaching learning-based optimization
KW - R2 indicator
KW - Reference directions
KW - Multi-objective evolutionary algorithm (MOEA)
KW - Optimization algorithm
KW - Reference point-based method
KW - NSGA-III
KW - Teaching learning-based optimization (TLBO)
U2 - 10.1145/3533050.3533053
DO - 10.1145/3533050.3533053
M3 - Chapter in a published conference proceeding
T3 - ACM International Conference Proceeding Series
SP - 19
EP - 23
BT - ISMSI 2022 - 6th International Conference on Intelligent Systems, Metaheuristics and Swarm Intelligence
PB - Association for Computing Machinery
CY - United States
T2 - International Conference on Intelligent Systems, Metaheuristics & Swarm Intelligence
Y2 - 9 April 2022 through 10 April 2022
ER -