Document Type: Original Research Paper

Author

Higher Education Complex of Bam, Bam, Iran.

DOI: 10.22061/jecei.2020.6581.325

Abstract

Background and Objectives: Owing to the random nature of heuristic algorithms, the stability analysis of heuristic ensemble classifiers is of particular importance.
Methods: The novelty of this paper is applying, for the first time, a statistical method consisting of the Plackett-Burman design and the Taguchi method to identify not only the important parameters but also their optimal levels. The Minitab and Design-Expert software packages are utilized to achieve the stability goals of this research.
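As a minimal illustration of the screening step described above, an eight-run Plackett-Burman design (sufficient for up to seven two-level factors) can be built from the classical cyclic generator given by Plackett and Burman; the factor labels in this sketch are hypothetical placeholders, not the actual parameters studied in the paper.

```python
# Illustrative sketch: construct the classical 8-run Plackett-Burman
# screening design from its cyclic generator row (Plackett & Burman, 1946).
# The factor labels below are hypothetical, not this paper's parameters.

GENERATOR = [+1, +1, +1, -1, +1, -1, -1]  # first row for N = 8 runs

def plackett_burman_8():
    """Return the 8x7 Plackett-Burman design matrix (entries are +/-1)."""
    rows = []
    for shift in range(7):
        # each subsequent row is a cyclic (right) shift of the generator
        rows.append(GENERATOR[-shift:] + GENERATOR[:-shift] if shift else list(GENERATOR))
    rows.append([-1] * 7)  # final run sets every factor to its low level
    return rows

if __name__ == "__main__":
    design = plackett_burman_8()
    factors = [f"p{i}" for i in range(1, 8)]  # hypothetical factor labels
    print(" ".join(f"{f:>3}" for f in factors))
    for run in design:
        print(" ".join(f"{v:+3d}" for v in run))
```

The resulting matrix is balanced (each column has four high and four low settings) and its columns are pairwise orthogonal, which is what lets main effects of up to seven parameters be screened in only eight runs.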
Results: The proposed approach is useful as a preprocessing step before employing heuristic ensemble classifiers; i.e., first discover the optimal levels of the important parameters and then apply these levels to the heuristic ensemble classifier to attain the best results. Another significant difference between this research and previous work on stability analysis is the definition of the response variable: the average of three Pareto-front criteria is used as the response variable. Finally, to demonstrate the performance of this method, the obtained optimal levels are applied to a typical multi-objective heuristic ensemble classifier, and its results are compared with those obtained using empirical values; the comparison indicates that the proposed method yields improvements.
Conclusion: This approach can analyze more parameters at lower computational cost than previous works, which is one of the main advantages of the proposed method.
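The Taguchi level-selection step mentioned above can be sketched as follows, assuming a "larger-is-better" signal-to-noise (S/N) ratio, since the Pareto-front quality criteria averaged into the response variable are to be maximized; the replicated response values here are made-up numbers for illustration only.

```python
import math

def sn_larger_is_better(responses):
    """Taguchi 'larger-is-better' signal-to-noise ratio in dB:
    S/N = -10 * log10( mean(1 / y_i^2) )."""
    return -10.0 * math.log10(sum(1.0 / y**2 for y in responses) / len(responses))

def best_level(runs_by_level):
    """Pick the parameter level whose replicated responses give the highest S/N."""
    return max(runs_by_level, key=lambda lvl: sn_larger_is_better(runs_by_level[lvl]))

if __name__ == "__main__":
    # Hypothetical replicated responses (averages of Pareto-front criteria)
    # observed at three candidate levels of a single parameter.
    runs = {
        "low":    [0.61, 0.58, 0.63],
        "medium": [0.72, 0.70, 0.74],
        "high":   [0.69, 0.66, 0.71],
    }
    for lvl, ys in runs.items():
        print(f"{lvl:>6}: S/N = {sn_larger_is_better(ys):6.2f} dB")
    print("best level:", best_level(runs))
```

In a full Taguchi analysis the same S/N computation is averaged over all orthogonal-array runs at each level of each parameter, and the level with the highest mean S/N is taken as optimal.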


[1] M. Hosni, I. Abnane, A. Idri, J. M. C. de Gea, J. L. F. Alemán, "Reviewing ensemble classification methods in breast cancer," Computer Methods and Programs in Biomedicine, 177: 89-112, 2019.

[2] X. Fan, S. Hu, J. He, "A dynamic selection ensemble method for target recognition based on clustering and randomized reference classifier," International Journal of Machine Learning and Cybernetics, 10(3): 515-525, 2019.

[3] H. Zhang, H. He, W. Zhang, "Classifier selection and clustering with fuzzy assignment in ensemble model for credit scoring," Neurocomputing, 316: 210-221, 2018.

[4] P. Sidhu, M. P. S. Bhatia, "A novel online ensemble approach to handle concept drifting data streams: diversified dynamic weighted majority," International Journal of Machine Learning and Cybernetics, 9(1): 37-61, 2018.

[5] R. Kianzad, H. Montazery Kordy, "Automatic sleep stages detection based on EEG signals using combination of classifiers," Journal of Electrical and Computer Engineering Innovations, 1(2): 99-105, 2013.

[6] J. Zhai, S. Zhang, C. Wang, "The classification of imbalanced large data sets based on MapReduce and ensemble of ELM classifiers," International Journal of Machine Learning and Cybernetics, 8(3): 1009-1017, 2017.

[7] V. S. Costa, A. D. S. Farias, B. Bedregal, R. H. Santiago, A. M. D. P. Canuto, "Combining multiple algorithms in classifier ensembles using generalized mixture functions," Neurocomputing, 313: 402-414, 2018.

[8] T. Takenouchi, S. Ishii, "Binary classifiers ensemble based on Bregman divergence for multi-class classification," Neurocomputing, 273: 424-434, 2018.

[9] M. R. Esmaeili, S. H. Zahiri, S. M. Razavi, "A framework for high-level synthesis of VLSI circuits using a modified moth-flame optimization algorithm," Journal of Electrical and Computer Engineering Innovations, 7(1): 93-110, 2019.

[10] I. Behravan, S. H. Zahiri, S. M. Razavi, R. Trasarti, "Clustering a big mobility dataset using an automatic swarm intelligence-based clustering method," Journal of Electrical and Computer Engineering Innovations, 6(2): 243-262, 2018.

[11] S. Roostaee, H. R. Ghaffary, "Diagnosis of heart disease based on meta heuristic algorithms and clustering methods," Journal of Electrical and Computer Engineering Innovations, 4(2): 105-110, 2016.

[12] M. Hasanluo, F. Soleimanian Gharehchopogh, "Software cost estimation by a new hybrid model of particle swarm optimization and k-nearest neighbor algorithms," Journal of Electrical and Computer Engineering Innovations, 4(1): 49-55, 2016.

[13] Z. K. Pourtaheri, S. H. Zahiri, S. M. Razavi, "Stability investigation of multi-objective heuristic ensemble classifiers," International Journal of Machine Learning and Cybernetics, 10(5): 1109-1121, 2019.

[14] L. Zhang, W. Srisukkham, S. C. Neoh, C. P. Lim, D. Pandit, "Classifier ensemble reduction using a modified firefly algorithm: An empirical evaluation," Expert Systems with Applications, 93: 395-422, 2018.

[15] A. Rahman, B. Verma, "Ensemble classifier generation using non-uniform layered clustering and genetic algorithm," Knowledge-Based Systems, 43: 30-42, 2013.

[16] R. Diao, F. Chao, T. Peng, N. Snooke, Q. Shen, "Feature selection inspired classifier ensemble reduction," IEEE Transactions on Cybernetics, 44(8): 1259-1268, 2014.

[17] P. Shunmugapriya, S. Kanmani, "Optimization of stacking ensemble configurations through artificial bee colony algorithm," Swarm and Evolutionary Computation, 12: 24-32, 2013.

[18] C. J. Tan, C. P. Lim, Y. N. Cheah, "A multi-objective evolutionary algorithm-based ensemble optimizer for feature selection and classification with neural network models," Neurocomputing, 125: 217-228, 2014.

[19] Z. K. Pourtaheri, S. H. Zahiri, S. M. Razavi, "Design and stability analysis of multi-objective ensemble classifiers," Electronic Letters on Computer Vision and Image Analysis, 15(3): 32-47, 2016.

[20] L. Condra, Reliability Improvement with Design of Experiment, CRC Press: 8, 2001.

[21] J. Antony, Design of Experiments for Engineers and Scientists, Elsevier: 1-2, 2014.

[22] R. L. Plackett, J. P. Burman, "The design of optimum multifactorial experiments," Biometrika, 33(4): 305-325, 1946.

[23] T. Mori, Taguchi Methods: Benefits, Impacts, Mathematics, Statistics, and Applications, ASME Press: 36, 2011.

[24] J. L. Rosa, A. Robin, M. B. Silva, C. A. Baldan, M. P. Peres, "Electrodeposition of copper on titanium wires: Taguchi experimental design approach," Journal of Materials Processing Technology, 209(3): 1181-1188, 2009.

[25] M. S. Phadke, Quality Engineering Using Robust Design, Prentice-Hall PTR, New Jersey, 1995.

[26] S. K. Karna, R. Sahai, "An overview on Taguchi method," International Journal of Engineering and Mathematical Sciences, 1(1): 1-7, 2012.

[27] R. K. Roy, Design of experiments using the Taguchi approach: 16 steps to product and process improvement, John Wiley & Sons, 2001.

[28] S. Mirjalili, S. M. Mirjalili, A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, 69: 46-61, 2014.

[29] S. J. Nanda, G. Panda, "A survey on nature inspired metaheuristic algorithms for partitional clustering," Swarm and Evolutionary Computation, 16: 1-18, 2014.

[30] N. Sayyadi Shahraki, S. H. Zahiri, "Low-area/low-power CMOS op-amps design based on total optimality index using reinforcement learning approach," Journal of Electrical and Computer Engineering Innovations, 6(2): 193-208, 2018.

[31] D. A. Van Veldhuizen, G. B. Lamont, "Evolutionary computation and convergence to a pareto front," presented at the Genetic Programming Conference: 221-228, 1998.

[32] J. R. Schott, "Fault tolerant design using single and multicriteria genetic algorithm optimization," MS thesis, Massachusetts Institute of Technology, Cambridge, 1995.

[33] M. Arjmand, A. A. Najafi, "Solving a multi-mode bi-objective resource investment problem using meta-heuristic algorithms," Advanced Computational Techniques in Electromagnetics, 1(1): 41-58, 2015.

[34] M. H. Mozaffari, H. Abdy, S. H. Zahiri, "IPO: an inclined planes system optimization algorithm," Computing and Informatics, 35(1): 222-240, 2016.