Authors

1 Young Researchers and Elite Club, Yasooj Branch, Islamic Azad University, Yasooj, Iran

2 Young Researchers and Elite Club, Nourabad Mamasani Branch, Islamic Azad University, Nourabad Mamasani, Iran

3 Sama Technical and Vocational Training College, Azad University of Shiraz, Shiraz, Iran

Abstract

The main task of cluster analysis is to assign a set of objects to groups (clusters) such that objects within one cluster are more similar to each other than to objects in other clusters. The SSPCO optimization algorithm is a recent optimization algorithm inspired by the behavior of a bird called the see-see partridge. Clustering is one of the problems that such intelligent algorithms are applied to solve; it is employed as a powerful tool in many data mining, data analysis, and data compression applications in order to group data into a given number of clusters (groups). In the present article, a chaotic SSPCO algorithm is used to cluster data on different benchmark datasets, and its performance is compared with clustering by the artificial bee colony (ABC) algorithm and the particle swarm optimization (PSO) clustering technique. The clustering tests were carried out on 13 datasets from the UCI machine learning repository. The results show that the SSPCO clustering algorithm is a very efficient technique for clustering multivariate data.
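For concreteness, the sketch below shows the two ingredients that a chaotic swarm-based clusterer of this kind typically combines: a centroid-encoded fitness function (here, the common sum of squared Euclidean distances to the nearest centroid) and a chaotic sequence used in place of uniform random draws. This is a minimal illustration under stated assumptions, not the authors' implementation; the objective, the choice of the logistic map, and all names such as `clustering_fitness` are hypothetical.

```python
import numpy as np

def clustering_fitness(centroids, data):
    """Sum of squared Euclidean distances from each point to its nearest
    centroid (a common clustering objective; assumed, not taken from the paper)."""
    # Pairwise distances: shape (n_points, n_clusters).
    d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
    # Each point contributes its squared distance to the closest centroid.
    return float(np.sum(np.min(d, axis=1) ** 2))

def logistic_map(n, x0=0.7, r=4.0):
    """Logistic chaotic sequence, a map often used in 'chaotic' variants of
    swarm optimizers (assumption: the paper may use a different map)."""
    seq = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)  # classic logistic recurrence
        seq[i] = x
    return seq

# Tiny usage example: score one hypothetical candidate (3 centroids in R^4).
rng = np.random.default_rng(0)
data = rng.normal(size=(150, 4))      # stand-in for a UCI-style dataset
candidate = rng.normal(size=(3, 4))   # one individual: 3 cluster centers
print(clustering_fitness(candidate, data))
print(logistic_map(5))                # chaotic draws in (0, 1)
```

In such a scheme, each individual in the population encodes one full set of cluster centers, and the optimizer (SSPCO, ABC, or PSO) evolves the population to minimize this fitness, with the chaotic sequence perturbing the search instead of a pseudo-random generator.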

Keywords

