An improvement of clustering-based objective reduction method for many-objective optimization problems

Multi-Objective Evolutionary Algorithms (MOEAs) have been widely used for solving multi-objective optimization problems (MOPs). When the number of objectives exceeds three, MOPs are regarded as many-objective optimization problems (MaOPs). MaOPs cause difficulties for existing MOEAs in terms of deterioration of the search ability, the requirement of an exponentially large population size, the visualization of results, and especially the computation cost. Numerous real-world optimization problems are MaOPs, yet many of them contain redundant objectives. For these problems, removing the redundant objectives can alleviate the difficulties of MaOPs. Recently, clustering has been applied to remove redundant objectives, for example in the Mutual Information and Clustering Algorithm (MICA-NORMOEA) and the Objective Clustering-Based Objective Reduction Algorithm (OC-ORA). However, these clustering-based algorithms are computationally complex and require manual tuning of a set of parameters. This paper proposes a clustering-based objective reduction method (COR) for MaOPs. The proposed method is more efficient than existing ones and can determine these parameters automatically, while still achieving results comparable with those of existing methods.

Table 2. Pairwise comparison of the objective reduction algorithms:
         (a) p-values; (b) null hypothesis accepted (0) or rejected (1)

 (a)       LPCA  OC-ORA  CORi  CORc        (b)       LPCA  OC-ORA  CORi  CORc
 LPCA      1.00    0.01  0.00  0.06        LPCA         0       1     1     0
 OC-ORA    0.01    1.00  0.17  0.12        OC-ORA       1       0     0     0
 CORi      0.00    0.17  1.00  0.01        CORi         1       0     0     1
 CORc      0.06    0.12  0.01  1.00        CORc         0       0     1     0
level at 0.05, and the alternative hypothesis is that the performance of the two methods
is significantly different. Table 2a shows the p-values obtained when comparing the algorithms
with each other. Table 2b indicates whether the null hypothesis is accepted or rejected (it is
rejected whenever the corresponding p-value in Table 2a is below 0.05). A value of 0 means that
the null hypothesis is accepted, in other words, the two algorithms perform similarly at the
0.05 significance level. A value of 1 means that the null hypothesis is rejected and the
alternative hypothesis is accepted, which means that the two algorithms perform differently at
the 0.05 significance level. As can be seen from Table 2b, LPCA is significantly different from
OC-ORA and CORi but similar to CORc at the 0.05 significance level. CORi is similar to OC-ORA
but different from LPCA and CORc.
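For illustration, the following minimal sketch (not the paper's code) shows how such a pairwise comparison could be computed from per-instance scores of the four algorithms. It assumes the Wilcoxon signed-rank test from SciPy as the significance test and a significance level of 0.05; the function name and data layout are hypothetical.

```python
import numpy as np
from scipy.stats import wilcoxon  # paired non-parametric test; the paper's exact test may differ


def pairwise_significance(results, alpha=0.05):
    """results: dict mapping algorithm name -> 1-D array of paired per-instance scores."""
    names = list(results)
    n = len(names)
    p_values = np.ones((n, n))            # as in Table 2a (diagonal left at 1.00)
    reject = np.zeros((n, n), dtype=int)  # as in Table 2b: 1 = null hypothesis rejected
    for i in range(n):
        for j in range(i + 1, n):
            _, p = wilcoxon(results[names[i]], results[names[j]])
            p_values[i, j] = p_values[j, i] = p
            reject[i, j] = reject[j, i] = int(p < alpha)
    return names, p_values, reject


# Hypothetical usage with random scores standing in for real per-instance results:
# rng = np.random.default_rng(0)
# scores = {name: rng.random(25) for name in ["LPCA", "OC-ORA", "CORi", "CORc"]}
# names, p_values, reject = pairwise_significance(scores)
```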
Table 3. The average number of times each objective reduction algorithm calls the MOEA
         to find the set of conflicting objectives, on DTLZ5IM(I, M) instances

  MaOEA    I    M     LPCA   OC-ORA   CORi   CORc
 2 5 2.0 3.8 2.0 2.0
 3 5 2.0 3.0 2.0 2.0
 GrEA 5 10 2.0 6.0 2.0 2.0
 7 10 2.0 4.0 2.0 2.0
 5 20 1.2 15.7 2.1 2.1
 2 5 2.0 3.8 2.0 2.0
 3 5 2.0 3.1 2.3 2.0
 KnEA 5 10 2.0 6.0 2.3 2.0
 7 10 2.0 4.1 2.4 2.0
 5 20 2.0 16.0 2.1 2.6
 2 5 2.0 4.0 2.0 2.0
 3 5 2.0 2.9 2.0 2.0
 NSGAIII 5 10 2.0 6.0 2.0 2.0
 7 10 2.0 4.0 2.0 2.0
 5 20 2.2 15.9 2.5 3.1
 2 5 2.0 3.8 2.0 2.0
 3 5 2.0 3.0 2.0 2.0
 RVEA 5 10 2.0 6.0 2.0 2.0
 7 10 2.0 4.0 2.1 2.0
 5 20 2.0 17.4 2.2 2.1
 2 5 2.0 4.0 2.0 2.0
 3 5 2.0 2.9 2.0 2.0
 θ-DEA 5 10 2.0 6.0 2.0 2.0
 7 10 2.0 4.0 2.1 2.0
 5 20 2.0 16.5 2.1 2.6
 2 5 2.0 4.0 2.1 2.0
 3 5 2.0 3.0 2.1 2.0
 LMEA 5 10 2.0 6.0 2.1 2.0
 7 10 2.0 4.0 2.1 2.0
 5 20 2.3 16.0 2.1 2.3
Besides the number of successful identifications of the non-redundant objective set reported
above, the number of iterations of the reduction process is also considered when comparing the
performance of the objective reduction algorithms. For the OC-ORA algorithm, each loop can
remove at most one redundant objective. Table 3 shows that COR and LPCA need only about two
loops to finish the process, while OC-ORA needs roughly M − I + 1 loops (about 16 loops on the
DTLZ5IM(5, 20) instances, for example). As a result, COR and LPCA are equivalent to each other,
and both are more efficient than OC-ORA, especially on highly redundant instances.
6. Conclusion and Future Works
This paper proposed a clustering-based objective reduction method that uses the PAM algorithm
to cluster the objectives of an MaOP in order to determine the redundant ones.
The proposed algorithm uses the Silhouette index to determine the number of clusters automatically
instead of requiring it as a parameter. We compared the proposed method with two other
existing methods on different instances of the DTLZ5IM(I, M) problem. The results showed
that the proposed algorithm is not only equivalent to or more efficient than the existing
methods, but also retains results comparable with theirs.
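As an illustration of this clustering step, the sketch below (not the authors' implementation) runs a small PAM-style k-medoids search on a precomputed distance matrix over the objectives and selects the number of clusters that maximizes the Silhouette index. The correlation-based distance and all names here are assumptions for illustration; the paper's exact conflict measure and reduction loop are defined in the preceding sections.

```python
import numpy as np
from sklearn.metrics import silhouette_score


def pam(dist, k, n_iter=100, seed=0):
    """A very small PAM-style k-medoids on a precomputed distance matrix."""
    rng = np.random.default_rng(seed)
    n = dist.shape[0]
    medoids = rng.choice(n, size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(dist[:, medoids], axis=1)        # assign each objective to its nearest medoid
        new_medoids = medoids.copy()
        for c in range(k):
            members = np.where(labels == c)[0]
            if members.size:
                costs = dist[np.ix_(members, members)].sum(axis=1)
                new_medoids[c] = members[np.argmin(costs)]  # member minimizing within-cluster distance
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return np.argmin(dist[:, medoids], axis=1), medoids


def cluster_objectives(F):
    """F: (n_solutions, M) objective matrix of a non-dominated set; returns the best clustering found."""
    M = F.shape[1]
    corr = np.corrcoef(F, rowvar=False)  # correlation between objective columns
    dist = 1.0 - corr                    # assumed conflict-based distance: 0 = fully correlated objectives
    best = None
    for k in range(2, M):                # the Silhouette index needs at least 2 and fewer than M clusters
        labels, medoids = pam(dist, k)
        score = silhouette_score(dist, labels, metric="precomputed")
        if best is None or score > best[0]:
            best = (score, k, labels, medoids)
    return best  # (silhouette value, number of clusters, cluster labels, medoid objective indices)
```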
There are other methods that can automatically determine the number of clusters; future work
could investigate incorporating them into the proposed method.
References
 [1] K. Miettinen, “Nonlinear multiobjective optimization, volume 12 of international series in operations research
 and management science,” 1999.
 [2] A. P. Engelbrecht, Computational intelligence: an introduction. John Wiley & Sons, 2007.
 [3] D. B. Fogel, “The advantages of evolutionary computation.” in BCEC, 1997, pp. 1–11.
 [4] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan, “A fast and elitist multiobjective genetic algorithm: Nsga-ii,”
 IEEE transactions on evolutionary computation, vol. 6, no. 2, pp. 182–197, 2002.
 [5] H. Ishibuchi, N. Tsukamoto, and Y. Nojima, “Evolutionary many-objective optimization: A short review,” in
 Proceedings of the IEEE Congress on Evolutionary Computation, CEC2008, June 1-6, 2008, Hong Kong, 2008,
 pp. 2419–2426.
 [6] S. Bechikh, M. Elarbi, and L. B. Said, “Many-objective optimization using evolutionary algorithms: a survey,”
 in Recent Advances in Evolutionary Multi-objective Optimization. Springer, 2017, pp. 105–137.
 [7] K. Deb and H. Jain, “An evolutionary many-objective optimization algorithm using reference-point-based
 nondominated sorting approach, part i: Solving problems with box constraints.” IEEE Trans. Evolutionary
 Computation, vol. 18, no. 4, pp. 577–601, 2014.
 [8] D. K. Saxena, J. A. Duro, A. Tiwari, K. Deb, and Q. Zhang, “Objective reduction in many-objective
 optimization: Linear and nonlinear algorithms,” IEEE Transactions on Evolutionary Computation, vol. 17,
 no. 1, pp. 77–99, 2013.
 [9] Y. Yuan, Y.-S. Ong, A. Gupta, and H. Xu, “Objective reduction in many-objective optimization: evolutionary
 multiobjective approaches and comprehensive/critical analysis,” IEEE Transactions on Evolutionary Computa-
 tion, vol. 22, no. 2, pp. 189–210, 2018.
[10] X. H. Nguyen, L. T. Bui, and C. T. Tran, “Improving many objective optimisation algorithms using objective
 dimensionality reduction,” Evolutionary Intelligence, pp. 1–16, 2019.
[11] H. K. Singh, A. Isaacs, and T. Ray, “A pareto corner search evolutionary algorithm and dimensionality reduction
 in many-objective optimization problems,” IEEE Transactions on Evolutionary Computation, vol. 15, no. 4,
 pp. 539–556, 2011.
[12] D. Brockhoff and E. Zitzler, “Objective reduction in evolutionary multiobjective optimization: Theory and
 applications,” Evolutionary computation, vol. 17, no. 2, pp. 135–166, 2009.
[13] F. Gu, H.-L. Liu, and Y.-m. Cheung, “A fast objective reduction algorithm based on dominance structure for
 many objective optimization,” in Asia-Pacific Conference on Simulated Evolution and Learning. Springer,
 2017, pp. 260–271.
[14] D. Brockhoff and E. Zitzler, “On objective conflicts and objective reduction in multiple criteria optimization,”
 TIK Report, vol. 243, 2006.
[15] D. K. Saxena and K. Deb, “Non-linear dimensionality reduction procedures for certain large-dimensional
 multi-objective optimization problems: Employing correntropy and a novel maximum variance unfolding,”
 in International Conference on Evolutionary Multi-Criterion Optimization. Springer, 2007, pp. 772–787.
[16] A. L. Jaimes, C. A. C. Coello, and J. E. U. Barrientos, “Online objective reduction to deal with many-objective
 problems,” in International Conference on Evolutionary Multi-Criterion Optimization. Springer, 2009, pp.
 423–437.
[17] X. H. Nguyen, L. T. Bui, and C. T. Tran, “Clustering method using pareto corner search evolutionary
 algorithm for objective reduction in many-objective optimization problems,” in Proceedings of the Tenth
 International Symposium on Information and Communication Technology, ser. SoICT 2019. New York, NY,
 USA: ACM, 2019, pp. 78–84. [Online]. Available: 
[18] A. López Jaimes, C. A. Coello Coello, and D. Chakraborty, “Objective reduction using a feature selection
 technique,” in Proceedings of the 10th annual conference on Genetic and Evolutionary Computation. ACM,
 2008, pp. 673–680.
[19] X. Guo, X. Wang, M. Wang, and Y. Wang, “A new objective reduction algorithm for many-objective problems:
 employing mutual information and clustering algorithm,” in Computational Intelligence and Security (CIS),
 2012 Eighth International Conference on. IEEE, 2012, pp. 11–16.
[20] X. Guo, Y. Wang, and X. Wang, “Using objective clustering for solving many-objective optimization problems,”
 Mathematical Problems in Engineering, vol. 2013, 2013.
[21] E. Zitzler, M. Laumanns, and L. Thiele, “Spea2: Improving the strength pareto evolutionary algorithm,” TIK-
 report, vol. 103, 2001.
[22] J. D. Knowles and D. W. Corne, “Approximating the nondominated front using the pareto archived evolution
 strategy,” Evolutionary computation, vol. 8, no. 2, pp. 149–172, 2000.
[23] Q. Zhang and H. Li, “Moea/d: A multiobjective evolutionary algorithm based on decomposition,” IEEE
 Transactions on evolutionary computation, vol. 11, no. 6, pp. 712–731, 2007.
[24] E. J. Hughes, “Multiple single objective pareto sampling,” in Congress on Evolutionary Computation 2003,
 2003, pp. 2678–2684.
[25] E. Zitzler and S. Künzli, “Indicator-based selection in multiobjective search,” in International Conference on
 Parallel Problem Solving from Nature. Springer, 2004, pp. 832–842.
[26] S. Yang, M. Li, X. Liu, and J. Zheng, “A grid-based evolutionary algorithm for many-objective optimization,”
 IEEE Transactions on Evolutionary Computation, vol. 17, no. 5, pp. 721–736, 2013.
[27] X. Zhang, Y. Tian, and Y. Jin, “A knee point-driven evolutionary algorithm for many-objective optimization,”
 IEEE Transactions on Evolutionary Computation, vol. 19, no. 6, pp. 761–776, 2015.
[28] C. O. S. Sorzano, J. Vargas, and A. P. Montano, “A survey of dimensionality reduction techniques,” arXiv
 preprint arXiv:1403.2877, 2014.
[29] Y.-m. Cheung and F. Gu, “Online objective reduction for many-objective optimization problems,” in Evolution-
 ary Computation (CEC), 2014 IEEE Congress on. IEEE, 2014, pp. 1165–1171.
[30] Y.-m. Cheung, F. Gu, and H.-L. Liu, “Objective extraction for many-objective optimization problems: Algorithm
 and test problems,” IEEE Transactions on Evolutionary Computation, vol. 20, no. 5, pp. 755–772, 2016.
[31] D. Brockhoff and E. Zitzler, “Dimensionality reduction in multiobjective optimization with (partial) dominance
 structure preservation: Generalized minimum objective subset problems,” TIK Report, vol. 247, 2006.
[32] ——, “Are all objectives necessary? on dimensionality reduction in evolutionary multiobjective optimization,”
 in Parallel Problem Solving from Nature-PPSN IX. Springer, 2006, pp. 533–542.
[33] H. X. Nguyen, L. T. Bui, and C. T. Tran, “A Pareto Corner Search Evolutionary Algorithm and Principal
 Component Analysis for Objective Dimensionality Reduction,” in 2019 11th International Conference on
 Knowledge and Systems Engineering (KSE) (KSE’19), Da Nang, Vietnam, Oct. 2019.
[34] Y. Yuan, Y.-S. Ong, A. Gupta, and H. Xu, “Objective reduction in many-objective optimization: evolutionary
 multiobjective approaches and comprehensive analysis,” IEEE Transactions on Evolutionary Computation,
 vol. 22, no. 2, pp. 189–210, 2018.
[35] L. Kaufman and P. Rousseeuw, Clustering by means of medoids. North-Holland, 1987.
[36] P. J. Rousseeuw, “Silhouettes: a graphical aid to the interpretation and validation of cluster analysis,” Journal
 of computational and applied mathematics, vol. 20, pp. 53–65, 1987.
[37] K. Deb and D. K. Saxena, “On finding pareto-optimal solutions through dimensionality reduction for certain
 large-dimensional multi-objective optimization problems,” Kangal report, vol. 2005011, 2005.
[38] R. Cheng, Y. Jin, M. Olhofer, and B. Sendhoff, “A reference vector guided evolutionary algorithm for many-
 objective optimization.” IEEE Trans. Evolutionary Computation, vol. 20, no. 5, pp. 773–791, 2016.
[39] Y. Yuan, H. Xu, B. Wang, and X. Yao, “A new dominance relation-based evolutionary algorithm for many-
 objective optimization,” IEEE Transactions on Evolutionary Computation, vol. 20, no. 1, pp. 16–37, 2016.
[40] X. Zhang, Y. Tian, R. Cheng, and Y. Jin, “A decision variable clustering-based evolutionary algorithm for
 large-scale many-objective optimization,” IEEE Transactions on Evolutionary Computation, vol. 22, no. 1, pp.
 97–112, 2018.
[41] Y. Tian, R. Cheng, X. Zhang, and Y. Jin, “Platemo: A matlab platform for evolutionary multi-objective
 optimization [educational forum],” IEEE Computational Intelligence Magazine, vol. 12, no. 4, pp. 73–87, 2017.
 Manuscript received 07-5-2019; Accepted 18-12-2019.
 
 Nguyen Xuan Hung received his B.Sc (1998) and M.Sc (2008) degrees in computer science
 and technology from Le Quy Don Technical University. Currently he is a PhD Candidate at
 the Faculty of Information Technology, Le Quy Don Technical University. His current research
 interests include Evolutionary Multi-Objective/Many-Objective Optimization and Objective
 Dimensionality Reduction. His email is nguyenxuanhung@outlook.com
 Bui Thu Lam received the Ph.D. degree in computer science from the University of New
 South Wales (UNSW), Australia, in 2007. He did postdoctoral training at UNSW from 2007
 until 2009. He has been involved with academics including teaching and research since 1998.
 Currently, he is an Associate Professor at Le Quy Don Technical University, Hanoi, Vietnam.
He is doing research in the field of evolutionary computation, specializing in evolutionary
multi-objective optimization.
Tran Cao Truong received the B.Sc. degree in applied mathematics and informatics from Ha
Noi University of Science, Vietnam, in 2005 and the M.S. degree from Le Quy Don Technical
 University, Viet Nam in 2009. He received the PhD degree in computer science from Victoria
University of Wellington, New Zealand, in 2018. He does research in the field of evolutionary
computation, specializing in evolutionary machine learning for data mining with missing data.
He serves as a reviewer of international journals, including IEEE Transactions on Evolution-
 ary Computation, IEEE Transactions on Cybernetics, Pattern Recognition, Knowledge-Based
 Systems, Applied Soft Computing and Engineering Application of Artificial Intelligence. He
 is also a PC member of international conferences, including IEEE Congress on Evolutionary
 Computation, IEEE Symposium Series on Computational Intelligence, the Australasian Joint
 Conference on Artificial Intelligence and the AAAI Conference on Artificial Intelligence.
AN IMPROVED OBJECTIVE DIMENSIONALITY REDUCTION ALGORITHM
USING A CLUSTERING METHOD
FOR MANY-OBJECTIVE OPTIMIZATION PROBLEMS

Abstract

Many-objective optimization problems have recently received considerable attention from
researchers. As the number of objectives increases, these problems cause difficulties for
existing multi-objective evolutionary algorithms with respect to the selection operator, the
requirement of a large population size, the visualization of the result set, and especially the
computational cost. Some problems have redundant objectives and can be degenerated into
problems of lower dimensionality. Removing the redundant objectives saves time and space when
solving such a problem. This paper improves a dimensionality reduction method that uses the PAM
algorithm to cluster the objectives and then removes the redundant ones. Two variants of the
proposed algorithm automatically determine the number of redundant objectives and are more
efficient than the original algorithm.
