Document Type: Original Research Paper

Authors

1 Faculty of Electrical and Computer Engineering, Babol Noshirvani University of Technology, Babol, Iran

2 Faculty of Electrical Engineering, Amirkabir University of Technology, Tehran, Iran

Abstract

Automatic analysis of human facial expressions is one of the challenging problems in machine vision. It has many applications in human-computer interaction, such as social signal processing, social robots, deceit detection, interactive video, and behavior monitoring. In this paper, we develop a new method for automatic facial expression recognition based on facial muscle anatomy and the structure of the human face. The algorithm finds the approximate locations of the effective facial muscles and extracts features by measuring skin texture in 11 local patches. Seven facial expressions, including neutral, are classified using the AdaBoost classifier, as well as other classifiers, on the MMI database. Experimental results show that analyzing skin texture in the selected local patches provides accurate and efficient information for identifying different facial expressions.
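The pipeline described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration only: the patch locations, the texture measure (mean gradient magnitude), the patch size, and the synthetic data are all assumptions for demonstration, not the paper's actual anatomy-driven method.

```python
# Hypothetical sketch: extract one texture statistic from 11 local
# patches of a face image, then classify the feature vector with
# AdaBoost. All specifics here are illustrative placeholders.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)

N_PATCHES = 11
PATCH = 16  # assumed patch side length in pixels

def patch_features(face, centers):
    """Mean gradient magnitude in each patch as a crude texture cue."""
    gy, gx = np.gradient(face.astype(float))
    mag = np.hypot(gx, gy)
    h = PATCH // 2
    return np.array([mag[r - h:r + h, c - h:c + h].mean()
                     for (r, c) in centers])

# Synthetic stand-in data: 7 expression classes, random 64x64 "faces".
centers = [(rng.integers(8, 56), rng.integers(8, 56))
           for _ in range(N_PATCHES)]
X = np.stack([patch_features(rng.random((64, 64)), centers)
              for _ in range(140)])
y = rng.integers(0, 7, size=140)  # 7 classes, including neutral

clf = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, y)
pred = clf.predict(X[:5])
print(pred.shape)  # (5,)
```

In the paper the patch centers would come from the estimated muscle locations rather than being random, and the texture measure would be richer (e.g., a local binary pattern histogram), but the feed-features-to-AdaBoost structure is the same.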

Keywords


