Human intention detection with facial expressions using video analytics

  • Abstract

    The manuscript should contain an abstract. The abstract should be self-contained and citation-free and should not exceed 200 words. It should state the purpose, approach, results, and conclusions of the work. The author should assume that the reader has some knowledge of the subject but has not read the paper; thus, the abstract should be intelligible and complete in itself (no numerical references), and it should not cite figures, tables, or sections of the paper. The abstract should be written in the third person rather than the first person.

  • Keywords

    Feature Tracker; Facial Expressions; Support Vector Machines; Emotion Classification.
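    The keywords outline the pipeline the paper builds on: facial feature points are tracked across video frames, and the resulting feature vectors are classified into emotion categories with a support vector machine. As an illustration only (the feature names, toy data, and training routine below are hypothetical and not taken from the paper), a minimal linear SVM trained by hinge-loss stochastic gradient descent can separate two expression classes from such tracked features:

    ```python
    def train_linear_svm(samples, labels, eta=0.1, lam=0.001, epochs=200):
        """Hinge-loss SGD for a linear SVM (illustrative sketch, fixed step size).
        samples: list of feature tuples; labels: +1 / -1."""
        dim = len(samples[0]) + 1          # +1 for an augmented bias feature
        w = [0.0] * dim
        for _ in range(epochs):
            for x, y in zip(samples, labels):
                xa = list(x) + [1.0]       # augment input with a constant bias term
                margin = y * sum(wi * xi for wi, xi in zip(w, xa))
                if margin < 1.0:           # hinge-loss violation: push w toward y * x
                    w = [wi + eta * (y * xi - lam * wi) for wi, xi in zip(w, xa)]
                else:                      # inside the margin: apply weight decay only
                    w = [wi * (1.0 - eta * lam) for wi in w]
        return w

    def predict(w, x):
        xa = list(x) + [1.0]
        return 1 if sum(wi * xi for wi, xi in zip(w, xa)) >= 0.0 else -1

    # Hypothetical tracked-feature vectors: (mouth-corner lift, brow raise).
    happy   = [(0.9, 0.1), (0.8, 0.2), (1.0, 0.0)]   # label +1
    neutral = [(0.1, 0.0), (0.2, 0.1), (0.0, 0.1)]   # label -1
    w = train_linear_svm(happy + neutral, [1, 1, 1, -1, -1, -1])
    print([predict(w, x) for x in happy + neutral])  # expect [1, 1, 1, -1, -1, -1]
    ```

    A production system would use a mature solver such as LIBSVM (reference [6] in the original paper) with kernel support and multi-class handling; the sketch above only shows the margin-maximizing idea behind the classifier named in the keywords.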




Article ID: 10032
DOI: 10.14419/ijet.v7i2.4.10032

Copyright © 2012-2015 Science Publishing Corporation Inc. All rights reserved.