Hyper-Volume Evolutionary Algorithm
We propose a multi-objective evolutionary algorithm (MOEA), named the Hyper-volume Evolutionary
Algorithm (HVEA). The algorithm is characterised by three components. First, individual fitness evaluation
depends on the current Pareto front, specifically on the ratio of its dominated hyper-volume to the current Pareto
front hyper-volume, hence giving an indication of how close the individual is to the current Pareto front. Second,
a ranking strategy classifies individuals based on their fitness instead of Pareto dominance; individuals within
the same rank are not guaranteed to be mutually non-dominated. Third, a crowding assignment mechanism
adapts according to the individual’s neighbouring area, controlled by the neighbouring area radius parameter,
and the archive of non-dominated solutions. We perform extensive experiments on the multiple 0/1 knapsack
problem using different greedy repair methods to compare the performance of HVEA to other MOEAs including
NSGA2, SEAMO2, SPEA2, IBEA and MOEA/D. This paper shows that by tuning the neighbouring area
radius parameter, the performance of the proposed HVEA can be pushed towards better convergence, diversity or
coverage, and this could be beneficial to different types of problems.
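To make the three components above concrete, the following Python sketch illustrates the general idea for a 2-objective maximisation problem: an individual's fitness is the ratio of the hyper-volume it dominates to the hyper-volume of the current Pareto front, and crowding counts the archive members that fall within a radius ω of the individual. This is a minimal illustration rather than the authors' implementation; in particular, the use of the origin as an implicit reference point and the Euclidean neighbourhood test are assumptions made only for this sketch (HVEA itself is described as avoiding an explicit reference point).

# Minimal sketch (not the authors' code) of the HVEA fitness idea for a
# 2-objective maximisation problem: fitness = hyper-volume dominated by the
# individual / hyper-volume of the current Pareto front.  The origin is used
# as an implicit reference point here purely for illustration.

def pareto_front(points):
    """Return the non-dominated points (maximisation)."""
    front = []
    for p in points:
        if not any(q != p and all(qi >= pi for qi, pi in zip(q, p)) for q in points):
            front.append(p)
    return front

def front_hypervolume_2d(front):
    """Hyper-volume of the region dominated by a 2-objective front, w.r.t. (0, 0)."""
    pts = sorted(set(front))              # ascending in f1, hence descending in f2 on a front
    hv, prev_f2 = 0.0, 0.0
    for f1, f2 in reversed(pts):          # sweep from largest f1 to smallest
        hv += f1 * max(f2 - prev_f2, 0.0)
        prev_f2 = max(prev_f2, f2)
    return hv

def hvea_fitness(individual, front):
    """Ratio of the individual's dominated hyper-volume to the front hyper-volume."""
    own_hv = individual[0] * individual[1]          # box dominated by the individual
    return own_hv / front_hypervolume_2d(front)

def crowding(individual, archive, omega):
    """Number of archive members inside the neighbouring area of radius omega."""
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    return sum(1 for other in archive if other != individual
               and dist(individual, other) <= omega)

population = [(8.0, 2.0), (6.0, 5.0), (3.0, 7.0), (4.0, 4.0)]
front = pareto_front(population)
for ind in population:
    print(ind, round(hvea_fitness(ind, front), 3), crowding(ind, population, omega=2.5))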
It is suggested to use ω = 1.0 for better diversity and better coverage, and 0.01 ≤ ω ≤ 0.05 for better convergence and faster computational time.

We further examine the performance of HVEA against other MOEAs using a fixed running time, although this approach is not widely adopted due to its low reliability. The running time (in seconds), reported in Table 25, is deduced from the minimum running time of all MOEAs on each knapsack instance over 2000 generations. The running time for the linear scalarising greedy repair method proposed by Jaszkiewicz [13] is twice as much as that of the methods proposed by Mumford [12] and by Zitzler and Thiele [11]. Under the time restriction, HVEA outperforms NSGA2, SPEA2, IBEA+ and IBEAHV. HVEA only outperforms SEAMO2 on 2- and 3-knapsack problems; SEAMO2 is slightly better than HVEA on 4-knapsack problems. It is suggested that, due to the steady-state approach and the simple Pareto dominance, SEAMO2 is able to perform more evaluations than any other MOEA, which could lead to a better performance. HVEA is slightly worse than, but competitive with, MOEA/D.

Table 13: Generational distance (permutation encoding)
                     External Population                 Extracted Population
Instance      hv1    hv100    mo/t    mo/w       hv1    hv100    mo/t    mo/w
ks2 250      3.16     3.21    5.98    7.00      3.19     3.23    6.18    7.18
ks2 500     10.00    10.17   17.57   16.05     10.08    10.33   18.64   16.84
ks2 750     17.49    17.06   29.28   24.43     17.78    17.40   33.04   26.86
ks3 250      2.08     2.84    4.64    3.86      7.99     9.90   14.77   13.18
ks3 500      4.97     6.81   10.31    6.84     16.72    27.44   37.67   26.73
ks3 750      9.29    12.21   17.22    9.99     26.94    48.53   59.89   38.67
ks4 250      2.24     3.75    3.80    2.92     14.11    30.34   20.63   17.17
ks4 500      4.23     6.79    8.21    5.11     24.22    57.09   39.98   34.94
ks4 750      6.79    10.63   13.32    7.47     34.17    86.46   61.01   51.07

Table 14: Inverted generational distance (permutation encoding)
                     External Population                 Extracted Population
Instance      hv1    hv100    mo/t    mo/w       hv1    hv100    mo/t    mo/w
ks2 250     10.13     9.06    3.07    4.26     10.13     9.06    3.07    4.26
ks2 500     10.96     9.98    6.60    6.55     10.96     9.98    6.60    6.55
ks2 750     48.07    46.02   37.36   31.90     48.07    46.02   37.37   31.90
ks3 250     15.86     5.22    5.20    6.26     15.95     6.52    7.83    8.37
ks3 500     42.80    15.02   16.43   15.10     42.85    16.99   19.83   18.18
ks3 750     63.84    25.89   30.90   23.10     63.89    27.77   34.74   27.54
ks4 250     14.97     5.32    5.76    5.91     15.08    11.06   12.70    9.77
ks4 500     38.54    14.94   16.82   13.49     38.60    22.45   31.17   20.77
ks4 750     63.77    29.00   31.73   22.26     63.87    36.84   49.64   31.97

Table 15: Running time in seconds (permutation encoding)
Instance     hv1   hv100    mo/t    mo/w
ks2 250        4       5       3       4
ks2 500        8       9       8       9
ks2 750       15      16      17      19
ks3 250       16      26      40      38
ks3 500       29      84     118     133
ks3 750       46     177     172     181
ks4 250       47     156     357     315
ks4 500      112     602    1025     972
ks4 750      163     870    1784    1807

Table 16: Generational distance (binary encoding)
                     External Population                 Extracted Population
Instance      hv1    hv100    mo/t    mo/w       hv1    hv100    mo/t    mo/w
ks2 250      3.91     3.78    4.47    7.28      3.91     3.78    4.54    7.29
ks2 500     11.42    11.03   15.53   17.39     11.42    11.03   15.53   17.39
ks2 750     26.86    27.10   34.07   34.88     26.86    27.10   34.07   34.88
ks3 250      3.26     4.08    4.77    4.57     10.18    12.28   14.33   14.01
ks3 500     10.67    10.69   12.18   10.15     24.41    31.16   37.44   32.18
ks3 750     22.54    22.11   23.62   19.32     38.93    53.36   64.60   52.75
ks4 250      3.26     4.46    4.08    3.63     16.27    26.40   21.17   19.24
ks4 500      9.37    11.80   10.06    8.09     35.47    69.12   49.77   44.30
ks4 750     18.56    20.13   16.93   12.85     57.80   114.62   80.97   71.23

Table 17: Inverted generational distance (binary encoding)
                     External Population                 Extracted Population
Instance      hv1    hv100    mo/t    mo/w       hv1    hv100    mo/t    mo/w
ks2 250     14.55    13.88    2.78    5.35     14.55    13.88    2.79    5.35
ks2 500     17.97    17.91    5.36    7.37     17.97    17.91    5.36    7.37
ks2 750     81.69    81.09   33.08   35.57     81.69    81.09   33.08   35.57
ks3 250     16.31     8.36    5.22    8.37     16.36     8.93    7.41    9.67
ks3 500     41.47    25.90   17.05   24.17     41.50    26.38   20.07   25.59
ks3 750     61.05    42.30   30.87   34.95     61.08    42.67   33.83   36.71
ks4 250     14.87     6.95    6.10    8.84     14.97    10.12   11.59   10.91
ks4 500     34.46    19.55   17.22   19.04     34.53    24.15   24.86   22.39
ks4 750     56.85    36.89   32.02   32.62     56.92    41.66   43.02   36.28

Table 18: Running time in seconds (binary)
Instance     hv1   hv100    mo/t    mo/w
ks2 250        8       8       5       5
ks2 500       18      19      13      13
ks2 750       32      33      25      25
ks3 250       25      35      18      18
ks3 500       39      56      39      40
ks3 750       56      74      59      57
ks4 250       55     139     113      87
ks4 500       82     285     359     342
ks4 750      110     427     536     538

Fig. 7: The percentage of hypervolume (binary encoding). [Figure omitted: box plots of hv1, ehv1, hv100, ehv100, mo/t, emo/t, mo/w, emo/w for instances (a) ks2 250 to (i) ks4 750.]

Table 19: Generational distance (te binary encoding)
                 External Population           Extracted Population
Instance      hv1    hv100    mo/t         hv1    hv100    mo/t
ks2 250      4.40     4.43    2.83        4.41     4.45    3.05
ks2 500     16.33    16.60    9.88       16.33    16.60   10.03
ks2 750     44.67    43.98   23.54       44.67    43.98   23.63
ks3 250      5.39     6.09    4.54       14.62    15.61   14.80
ks3 500     16.66    15.73   10.49       37.71    42.51   37.10
ks3 750     35.66    30.38   19.88       66.81    77.08   65.21
ks4 250      4.91     5.75    3.61       22.65    34.78   20.41
ks4 500     14.45    14.45    8.09       52.51    83.35   46.14
ks4 750     30.20    24.54   13.47       93.91   141.43   65.90

Table 20: Inverted generational distance (te binary encoding)
                 External Population           Extracted Population
Instance      hv1    hv100    mo/t         hv1    hv100    mo/t
ks2 250      3.06     2.93    1.52        3.06     2.93    1.53
ks2 500      6.88     6.80    3.69        6.88     6.80    3.69
ks2 750     44.16    42.82   26.22       44.16    42.82   26.22
ks3 250     10.22     6.18    5.03       10.66     7.68    7.45
ks3 500     30.69    19.09   15.73       30.86    20.57   18.94
ks3 750     47.42    35.89   31.87       47.63    37.57   35.11
ks4 250     10.89     7.15    5.52       11.17    12.61   12.42
ks4 500     29.43    22.33   16.88       29.56    29.20   24.87
ks4 750     49.19    42.32   32.32       49.37    50.01   42.92

Fig. 8: The percentage of hypervolume (te binary encoding). [Figure omitted: box plots of hv1, ehv1, hv100, ehv100, mo/t, emo/t for instances (a) ks2 250 to (i) ks4 750.]
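The generational distance and inverted generational distance figures reported in the tables above follow the usual definitions of these indicators: generational distance averages the distance from each obtained solution to its nearest point on a reference front, and inverted generational distance averages the distance from each reference point to its nearest obtained solution. The short Python sketch below illustrates the computation; the reference fronts and any normalisation used in the paper are not reproduced here, so this is only an illustration of the indicators, not of the paper's exact setup.

# Illustrative (not the paper's) computation of generational distance (GD) and
# inverted generational distance (IGD) between an approximation set and a
# reference Pareto front, both given as lists of objective vectors.

from math import dist  # Euclidean distance, Python 3.8+

def gd(approximation, reference):
    """Average distance from each obtained point to its nearest reference point."""
    return sum(min(dist(a, r) for r in reference) for a in approximation) / len(approximation)

def igd(approximation, reference):
    """Average distance from each reference point to its nearest obtained point."""
    return sum(min(dist(r, a) for a in approximation) for r in reference) / len(reference)

reference_front = [(0.0, 10.0), (4.0, 8.0), (7.0, 5.0), (9.0, 1.0)]
obtained_set = [(3.5, 7.5), (6.0, 5.0), (8.5, 1.0)]
print(round(gd(obtained_set, reference_front), 3), round(igd(obtained_set, reference_front), 3))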
Table 21: Running time in seconds (te binary)
Instance     hv1   hv100    mo/t
ks2 250       14      13       9
ks2 500       33      32      25
ks2 750       62      59      51
ks3 250       26      42      24
ks3 500       58      81      60
ks3 750      102     126     105
ks4 250       49     164      98
ks4 500       92     331     424
ks4 750      157     527     769

Table 22: Generational distance (ws binary encoding)
                 External Population           Extracted Population
Instance      hv1    hv100    mo/w         hv1    hv100    mo/w
ks2 250      1.91     1.85    1.81        2.01     1.95    1.96
ks2 500      6.13     6.07    3.74        6.13     6.07    4.23
ks2 750     16.53    16.38    8.10       16.53    16.38    9.08
ks3 250      2.61     3.24    1.84        9.82    10.42   10.28
ks3 500      7.64     8.11    3.15       21.95    25.21   18.53
ks3 750     14.23    14.80    5.15       33.44    41.42   27.36
ks4 250      3.02     3.66    1.56       17.80    24.37   16.10
ks4 500      7.42     9.11    2.72       35.33    54.99   30.81
ks4 750     14.49    16.17    4.34       58.33    93.31   44.10

Fig. 9: The percentage of hypervolume (ws binary encoding). [Figure omitted: box plots of hv1, ehv1, hv100, ehv100, mo/w, emo/w for instances (a) ks2 250 to (i) ks4 750.]

Table 23: Inverted generational distance (ws binary encoding)
                 External Population           Extracted Population
Instance      hv1    hv100    mo/w         hv1    hv100    mo/w
ks2 250      3.80     3.39    1.20        3.80     3.40    1.21
ks2 500      7.02     6.62    1.66        7.02     6.62    1.66
ks2 750     41.14    40.31   10.29       41.14    40.31   10.31
ks3 250     10.80     4.31    2.85       11.13     6.13    8.32
ks3 500     31.98    14.04    6.71       32.08    15.44   13.18
ks3 750     48.87    24.76   11.37       48.95    25.93   21.12
ks4 250     11.45     4.99    2.97       11.66    10.52   13.93
ks4 500     30.56    14.94    6.90       30.66    21.67   31.76
ks4 750     50.08    29.11   12.00       50.20    36.67   47.92

Table 24: Running time in seconds (ws binary)
Instance     hv1   hv100    mo/w
ks2 250        9      13       8
ks2 500       21      30      22
ks2 750       39      55      43
ks3 250       21      44      33
ks3 500       41      82     107
ks3 750       66     124     137
ks4 250       47     180     339
ks4 500       75     326     885
ks4 750      111     500    1377

Table 25: Running time (in seconds) for the multiple 0/1 knapsack problem
Instance    Mumford [12] & Zitzler [11]    Jaszkiewicz [13]
ks2 250                  5                        10
ks2 500                 15                        30
ks2 750                 30                        60
ks3 250                 15                        30
ks3 500                 30                        60
ks3 750                 45                        90
ks4 250                 30                        60
ks4 500                 45                        90
ks4 750                 60                       120
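The greedy repair methods compared in Table 25 restore the feasibility of an over-full solution by removing selected items until every knapsack capacity constraint is met; they differ mainly in the order in which items are considered for removal. The Python sketch below shows the general mechanism with one simple aggregate profit-to-weight ordering; the specific orderings used by Mumford [12], Zitzler and Thiele [11] and Jaszkiewicz [13] (linear scalarising) are not reproduced here, so treat this as an illustration of the repair idea only.

# Illustrative greedy repair for the multiple 0/1 knapsack problem: while any
# knapsack exceeds its capacity, drop the selected item with the worst
# aggregate profit-to-weight ratio.  The ratio used here is one simple choice;
# the repair methods cited in the paper use their own (different) orderings.

def greedy_repair(selected, profits, weights, capacities):
    """selected: set of item indices; profits[k][i], weights[k][i]: item i in knapsack k."""
    selected = set(selected)

    def overloaded():
        return [k for k, cap in enumerate(capacities)
                if sum(weights[k][i] for i in selected) > cap]

    def ratio(i):  # aggregate desirability of item i over all knapsacks
        return sum(profits[k][i] for k in range(len(capacities))) / \
               max(sum(weights[k][i] for k in range(len(capacities))), 1e-9)

    while overloaded() and selected:
        selected.remove(min(selected, key=ratio))   # drop the least desirable item
    return selected

profits = [[10, 4, 7, 2], [3, 9, 6, 8]]     # two knapsacks (objectives), four items
weights = [[5, 3, 4, 2], [4, 2, 5, 3]]
capacities = [8, 7]
print(sorted(greedy_repair({0, 1, 2, 3}, profits, weights, capacities)))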
5. Conclusions

In this paper we proposed a multi-objective evolutionary algorithm called the Hyper-volume Evolutionary Algorithm (HVEA), which incorporates distinctive mechanisms for fitness assignment, ranking and crowding. The algorithm uses hyper-volume, Pareto dominance and the distance between neighbouring solutions in order to drive improvement of the current Pareto front. Experiments conducted in this paper using the multiple 0/1 knapsack problem show that HVEA outperforms or remains very competitive against other MOEAs including NSGA2, SEAMO2, SPEA2, IBEA+, IBEAHV and the high-performing MOEA/D. HVEA proposes a new individual fitness assignment strategy using the hyper-volume of an individual without the requirement of determining the reference point for the hyper-volume calculation. HVEA assesses the crowding of an individual by considering all individuals in its neighbourhood. By tuning the only parameter ω, the neighbouring area radius, the performance of the algorithm can be driven to attain a non-dominated set with specific characteristics. That is, HVEA can be tuned to perform better on convergence, diversity or coverage, and this could be useful in different applications when solving multi-objective optimisation problems.

The paper also extensively studied the multiple 0/1 knapsack problem using different greedy repair methods. In order to assess the performance of MOEAs fairly, the same greedy repair method should be used. Furthermore, we suggest that the greedy repair method proposed by Mumford [12] should be deployed when assessing the performance of MOEAs on the multiple 0/1 knapsack problem, in order to minimise the effect of the greedy repair method on the overall performance.

For future work, it would be interesting to investigate the incorporation of the steady-state selection deployed by SEAMO2 and MOEA/D into fitness and/or ranking assignment population-based MOEAs. The reason behind this is that the steady-state selection allows strong offspring to immediately participate in the recombination of the current generation, which could lead to a better convergence, whereas the fitness and/or ranking assignment population-based MOEAs consider the whole population during the environmental selection, which could lead to a better diversity.
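The distinction drawn in the future-work paragraph can be made concrete with a small sketch: in a steady-state loop each accepted offspring re-enters the parent pool immediately, whereas in a generational loop the whole offspring population is produced before environmental selection acts on parents and offspring together. The Python code below is illustrative only and does not reproduce any specific MOEA; make_offspring, accept and select_best stand in for whichever variation, acceptance and selection schemes an algorithm uses, and the toy demo at the end simply maximises the number of ones in a bit string.

# Illustrative contrast between steady-state and generational selection loops.
import random

def steady_state(population, generations, make_offspring, accept):
    for _ in range(generations * len(population)):      # roughly matched evaluation budget
        child = make_offspring(random.sample(population, 2))
        slot = accept(population, child)                 # e.g. replace a worse member
        if slot is not None:
            population[slot] = child                     # child joins the parent pool at once
    return population

def generational(population, generations, make_offspring, select_best):
    for _ in range(generations):
        offspring = [make_offspring(random.sample(population, 2))
                     for _ in range(len(population))]
        population = select_best(population + offspring, len(population))
    return population

# Toy demo: maximise the number of ones in a 16-bit string.
def make_offspring(parents):
    a, b = parents
    cut = random.randrange(1, len(a))
    child = a[:cut] + b[cut:]                            # one-point crossover
    i = random.randrange(len(child))
    child[i] ^= 1                                        # bit-flip mutation
    return child

def accept(pop, child):
    worst = min(range(len(pop)), key=lambda i: sum(pop[i]))
    return worst if sum(child) >= sum(pop[worst]) else None

def select_best(candidates, size):
    return sorted(candidates, key=sum, reverse=True)[:size]

pop = [[random.randint(0, 1) for _ in range(16)] for _ in range(10)]
print(sum(max(steady_state([p[:] for p in pop], 20, make_offspring, accept), key=sum)))
print(sum(max(generational([p[:] for p in pop], 20, make_offspring, select_best), key=sum)))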
References

[1] K. Le, D. Landa-Silva, H. Li, An improved version of volume dominance for multi-objective optimisation, in: Proceedings of the 2009 International Conference on Evolutionary Multi-criterion Optimization (EMO 2009), Vol. 5467 of Lecture Notes in Computer Science, Springer, 2009, pp. 231–245.
[2] K. Deb, A. Pratap, S. Agarwal, T. Meyarivan, A fast and elitist multiobjective genetic algorithm: NSGA-II, IEEE Transactions on Evolutionary Computation 6 (2) (2002) 182–197.
[3] C. Mumford, Simple population replacement strategies for a steady-state multi-objective evolutionary algorithm, in: Genetic and Evolutionary Computation GECCO 2004, Vol. 3102 of Lecture Notes in Computer Science, Springer, 2004, pp. 1389–1400.
[4] E. Zitzler, M. Laumanns, L. Thiele, SPEA2: Improving the strength Pareto evolutionary algorithm for multiobjective optimization, in: Evolutionary Methods for Design, Optimisation and Control with Application to Industrial Problems (EUROGEN 2001), International Center for Numerical Methods in Engineering (CIMNE), 2002, pp. 95–100.
[5] E. Zitzler, S. Künzli, Indicator-based selection in multiobjective search, in: Parallel Problem Solving from Nature (PPSN VIII), Springer-Verlag, 2004, pp. 832–842.
[6] Q. Zhang, H. Li, MOEA/D: A multiobjective evolutionary algorithm based on decomposition, IEEE Transactions on Evolutionary Computation 11 (6) (2007) 712–731.
[7] H. Li, D. Landa-Silva, An adaptive evolutionary multi-objective approach based on simulated annealing, Evolutionary Computation 19 (4) (2011) 561–595.
[8] N. Beume, B. Naujoks, M. Emmerich, SMS-EMOA: Multiobjective selection based on dominated hypervolume, European Journal of Operational Research 181 (3) (2007) 1653–1669.
[9] L. T. Bui, H. A. Abbass, M. Barlow, S. Wesolkowski, A. Bender, J. Liu, DMEA: A direction-based multiobjective evolutionary algorithm, Memetic Computing 3 (4) (2011) 271–285.
[10] L. Nguyen, L. T. Bui, H. A. Abbass, DMEA-II: The direction-based multiobjective evolutionary algorithm II, Soft Computing (2013) 1–16.
[11] E. Zitzler, L. Thiele, Multiobjective evolutionary algorithms: A comparative case study and the strength Pareto approach, IEEE Transactions on Evolutionary Computation 3 (4) (1999) 257–271.
[12] C. L. Valenzuela, A simple evolutionary algorithm for multi-objective optimization (SEAMO), in: Proceedings of the 2002 Congress on Evolutionary Computation (CEC 2002), 2002, pp. 717–722.
[13] A. Jaszkiewicz, On the performance of multiple-objective genetic local search on the 0/1 knapsack problem - a comparative experiment, IEEE Transactions on Evolutionary Computation 6 (4) (2002) 402–412.